Feb  2 04:00:39 np0005604791 kernel: Linux version 5.14.0-665.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026
Feb  2 04:00:39 np0005604791 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb  2 04:00:39 np0005604791 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb  2 04:00:39 np0005604791 kernel: BIOS-provided physical RAM map:
Feb  2 04:00:39 np0005604791 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb  2 04:00:39 np0005604791 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb  2 04:00:39 np0005604791 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb  2 04:00:39 np0005604791 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb  2 04:00:39 np0005604791 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb  2 04:00:39 np0005604791 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb  2 04:00:39 np0005604791 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb  2 04:00:39 np0005604791 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Feb  2 04:00:39 np0005604791 kernel: NX (Execute Disable) protection: active
Feb  2 04:00:39 np0005604791 kernel: APIC: Static calls initialized
Feb  2 04:00:39 np0005604791 kernel: SMBIOS 2.8 present.
Feb  2 04:00:39 np0005604791 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb  2 04:00:39 np0005604791 kernel: Hypervisor detected: KVM
Feb  2 04:00:39 np0005604791 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb  2 04:00:39 np0005604791 kernel: kvm-clock: using sched offset of 5598956451 cycles
Feb  2 04:00:39 np0005604791 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb  2 04:00:39 np0005604791 kernel: tsc: Detected 2800.000 MHz processor
Feb  2 04:00:39 np0005604791 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Feb  2 04:00:39 np0005604791 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb  2 04:00:39 np0005604791 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb  2 04:00:39 np0005604791 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb  2 04:00:39 np0005604791 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb  2 04:00:39 np0005604791 kernel: Using GB pages for direct mapping
Feb  2 04:00:39 np0005604791 kernel: RAMDISK: [mem 0x2d410000-0x329fffff]
Feb  2 04:00:39 np0005604791 kernel: ACPI: Early table checksum verification disabled
Feb  2 04:00:39 np0005604791 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb  2 04:00:39 np0005604791 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb  2 04:00:39 np0005604791 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb  2 04:00:39 np0005604791 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb  2 04:00:39 np0005604791 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb  2 04:00:39 np0005604791 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb  2 04:00:39 np0005604791 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb  2 04:00:39 np0005604791 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb  2 04:00:39 np0005604791 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb  2 04:00:39 np0005604791 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb  2 04:00:39 np0005604791 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb  2 04:00:39 np0005604791 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb  2 04:00:39 np0005604791 kernel: No NUMA configuration found
Feb  2 04:00:39 np0005604791 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Feb  2 04:00:39 np0005604791 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Feb  2 04:00:39 np0005604791 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Feb  2 04:00:39 np0005604791 kernel: Zone ranges:
Feb  2 04:00:39 np0005604791 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb  2 04:00:39 np0005604791 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb  2 04:00:39 np0005604791 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Feb  2 04:00:39 np0005604791 kernel:  Device   empty
Feb  2 04:00:39 np0005604791 kernel: Movable zone start for each node
Feb  2 04:00:39 np0005604791 kernel: Early memory node ranges
Feb  2 04:00:39 np0005604791 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb  2 04:00:39 np0005604791 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb  2 04:00:39 np0005604791 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Feb  2 04:00:39 np0005604791 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Feb  2 04:00:39 np0005604791 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb  2 04:00:39 np0005604791 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb  2 04:00:39 np0005604791 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb  2 04:00:39 np0005604791 kernel: ACPI: PM-Timer IO Port: 0x608
Feb  2 04:00:39 np0005604791 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb  2 04:00:39 np0005604791 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb  2 04:00:39 np0005604791 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb  2 04:00:39 np0005604791 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb  2 04:00:39 np0005604791 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb  2 04:00:39 np0005604791 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb  2 04:00:39 np0005604791 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb  2 04:00:39 np0005604791 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb  2 04:00:39 np0005604791 kernel: TSC deadline timer available
Feb  2 04:00:39 np0005604791 kernel: CPU topo: Max. logical packages:   8
Feb  2 04:00:39 np0005604791 kernel: CPU topo: Max. logical dies:       8
Feb  2 04:00:39 np0005604791 kernel: CPU topo: Max. dies per package:   1
Feb  2 04:00:39 np0005604791 kernel: CPU topo: Max. threads per core:   1
Feb  2 04:00:39 np0005604791 kernel: CPU topo: Num. cores per package:     1
Feb  2 04:00:39 np0005604791 kernel: CPU topo: Num. threads per package:   1
Feb  2 04:00:39 np0005604791 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Feb  2 04:00:39 np0005604791 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb  2 04:00:39 np0005604791 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb  2 04:00:39 np0005604791 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb  2 04:00:39 np0005604791 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb  2 04:00:39 np0005604791 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb  2 04:00:39 np0005604791 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb  2 04:00:39 np0005604791 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb  2 04:00:39 np0005604791 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb  2 04:00:39 np0005604791 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb  2 04:00:39 np0005604791 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb  2 04:00:39 np0005604791 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb  2 04:00:39 np0005604791 kernel: Booting paravirtualized kernel on KVM
Feb  2 04:00:39 np0005604791 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb  2 04:00:39 np0005604791 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb  2 04:00:39 np0005604791 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Feb  2 04:00:39 np0005604791 kernel: kvm-guest: PV spinlocks disabled, no host support
Feb  2 04:00:39 np0005604791 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb  2 04:00:39 np0005604791 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64", will be passed to user space.
Feb  2 04:00:39 np0005604791 kernel: random: crng init done
Feb  2 04:00:39 np0005604791 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: Fallback order for Node 0: 0 
Feb  2 04:00:39 np0005604791 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Feb  2 04:00:39 np0005604791 kernel: Policy zone: Normal
Feb  2 04:00:39 np0005604791 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb  2 04:00:39 np0005604791 kernel: software IO TLB: area num 8.
Feb  2 04:00:39 np0005604791 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb  2 04:00:39 np0005604791 kernel: ftrace: allocating 49438 entries in 194 pages
Feb  2 04:00:39 np0005604791 kernel: ftrace: allocated 194 pages with 3 groups
Feb  2 04:00:39 np0005604791 kernel: Dynamic Preempt: voluntary
Feb  2 04:00:39 np0005604791 kernel: rcu: Preemptible hierarchical RCU implementation.
Feb  2 04:00:39 np0005604791 kernel: rcu: 	RCU event tracing is enabled.
Feb  2 04:00:39 np0005604791 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb  2 04:00:39 np0005604791 kernel: 	Trampoline variant of Tasks RCU enabled.
Feb  2 04:00:39 np0005604791 kernel: 	Rude variant of Tasks RCU enabled.
Feb  2 04:00:39 np0005604791 kernel: 	Tracing variant of Tasks RCU enabled.
Feb  2 04:00:39 np0005604791 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb  2 04:00:39 np0005604791 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb  2 04:00:39 np0005604791 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb  2 04:00:39 np0005604791 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb  2 04:00:39 np0005604791 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb  2 04:00:39 np0005604791 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb  2 04:00:39 np0005604791 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb  2 04:00:39 np0005604791 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb  2 04:00:39 np0005604791 kernel: Console: colour VGA+ 80x25
Feb  2 04:00:39 np0005604791 kernel: printk: console [ttyS0] enabled
Feb  2 04:00:39 np0005604791 kernel: ACPI: Core revision 20230331
Feb  2 04:00:39 np0005604791 kernel: APIC: Switch to symmetric I/O mode setup
Feb  2 04:00:39 np0005604791 kernel: x2apic enabled
Feb  2 04:00:39 np0005604791 kernel: APIC: Switched APIC routing to: physical x2apic
Feb  2 04:00:39 np0005604791 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb  2 04:00:39 np0005604791 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Feb  2 04:00:39 np0005604791 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb  2 04:00:39 np0005604791 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb  2 04:00:39 np0005604791 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb  2 04:00:39 np0005604791 kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Feb  2 04:00:39 np0005604791 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb  2 04:00:39 np0005604791 kernel: Spectre V2 : Mitigation: Retpolines
Feb  2 04:00:39 np0005604791 kernel: RETBleed: Mitigation: untrained return thunk
Feb  2 04:00:39 np0005604791 kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Feb  2 04:00:39 np0005604791 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb  2 04:00:39 np0005604791 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Feb  2 04:00:39 np0005604791 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb  2 04:00:39 np0005604791 kernel: active return thunk: retbleed_return_thunk
Feb  2 04:00:39 np0005604791 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb  2 04:00:39 np0005604791 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb  2 04:00:39 np0005604791 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb  2 04:00:39 np0005604791 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb  2 04:00:39 np0005604791 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb  2 04:00:39 np0005604791 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb  2 04:00:39 np0005604791 kernel: Freeing SMP alternatives memory: 40K
Feb  2 04:00:39 np0005604791 kernel: pid_max: default: 32768 minimum: 301
Feb  2 04:00:39 np0005604791 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Feb  2 04:00:39 np0005604791 kernel: landlock: Up and running.
Feb  2 04:00:39 np0005604791 kernel: Yama: becoming mindful.
Feb  2 04:00:39 np0005604791 kernel: SELinux:  Initializing.
Feb  2 04:00:39 np0005604791 kernel: LSM support for eBPF active
Feb  2 04:00:39 np0005604791 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb  2 04:00:39 np0005604791 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb  2 04:00:39 np0005604791 kernel: ... version:                0
Feb  2 04:00:39 np0005604791 kernel: ... bit width:              48
Feb  2 04:00:39 np0005604791 kernel: ... generic registers:      6
Feb  2 04:00:39 np0005604791 kernel: ... value mask:             0000ffffffffffff
Feb  2 04:00:39 np0005604791 kernel: ... max period:             00007fffffffffff
Feb  2 04:00:39 np0005604791 kernel: ... fixed-purpose events:   0
Feb  2 04:00:39 np0005604791 kernel: ... event mask:             000000000000003f
Feb  2 04:00:39 np0005604791 kernel: signal: max sigframe size: 1776
Feb  2 04:00:39 np0005604791 kernel: rcu: Hierarchical SRCU implementation.
Feb  2 04:00:39 np0005604791 kernel: rcu: 	Max phase no-delay instances is 400.
Feb  2 04:00:39 np0005604791 kernel: smp: Bringing up secondary CPUs ...
Feb  2 04:00:39 np0005604791 kernel: smpboot: x86: Booting SMP configuration:
Feb  2 04:00:39 np0005604791 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb  2 04:00:39 np0005604791 kernel: smp: Brought up 1 node, 8 CPUs
Feb  2 04:00:39 np0005604791 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Feb  2 04:00:39 np0005604791 kernel: node 0 deferred pages initialised in 10ms
Feb  2 04:00:39 np0005604791 kernel: Memory: 7763692K/8388068K available (16384K kernel code, 5801K rwdata, 13928K rodata, 4196K init, 7192K bss, 618408K reserved, 0K cma-reserved)
Feb  2 04:00:39 np0005604791 kernel: devtmpfs: initialized
Feb  2 04:00:39 np0005604791 kernel: x86/mm: Memory block size: 128MB
Feb  2 04:00:39 np0005604791 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb  2 04:00:39 np0005604791 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Feb  2 04:00:39 np0005604791 kernel: pinctrl core: initialized pinctrl subsystem
Feb  2 04:00:39 np0005604791 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb  2 04:00:39 np0005604791 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Feb  2 04:00:39 np0005604791 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb  2 04:00:39 np0005604791 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb  2 04:00:39 np0005604791 kernel: audit: initializing netlink subsys (disabled)
Feb  2 04:00:39 np0005604791 kernel: audit: type=2000 audit(1770022838.682:1): state=initialized audit_enabled=0 res=1
Feb  2 04:00:39 np0005604791 kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb  2 04:00:39 np0005604791 kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb  2 04:00:39 np0005604791 kernel: thermal_sys: Registered thermal governor 'user_space'
Feb  2 04:00:39 np0005604791 kernel: cpuidle: using governor menu
Feb  2 04:00:39 np0005604791 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb  2 04:00:39 np0005604791 kernel: PCI: Using configuration type 1 for base access
Feb  2 04:00:39 np0005604791 kernel: PCI: Using configuration type 1 for extended access
Feb  2 04:00:39 np0005604791 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb  2 04:00:39 np0005604791 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb  2 04:00:39 np0005604791 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb  2 04:00:39 np0005604791 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb  2 04:00:39 np0005604791 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb  2 04:00:39 np0005604791 kernel: Demotion targets for Node 0: null
Feb  2 04:00:39 np0005604791 kernel: cryptd: max_cpu_qlen set to 1000
Feb  2 04:00:39 np0005604791 kernel: ACPI: Added _OSI(Module Device)
Feb  2 04:00:39 np0005604791 kernel: ACPI: Added _OSI(Processor Device)
Feb  2 04:00:39 np0005604791 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb  2 04:00:39 np0005604791 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb  2 04:00:39 np0005604791 kernel: ACPI: Interpreter enabled
Feb  2 04:00:39 np0005604791 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb  2 04:00:39 np0005604791 kernel: ACPI: Using IOAPIC for interrupt routing
Feb  2 04:00:39 np0005604791 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb  2 04:00:39 np0005604791 kernel: PCI: Using E820 reservations for host bridge windows
Feb  2 04:00:39 np0005604791 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb  2 04:00:39 np0005604791 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb  2 04:00:39 np0005604791 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [3] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [4] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [5] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [6] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [7] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [8] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [9] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [10] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [11] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [12] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [13] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [14] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [15] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [16] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [17] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [18] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [19] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [20] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [21] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [22] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [23] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [24] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [25] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [26] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [27] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [28] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [29] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [30] registered
Feb  2 04:00:39 np0005604791 kernel: acpiphp: Slot [31] registered
Feb  2 04:00:39 np0005604791 kernel: PCI host bridge to bus 0000:00
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Feb  2 04:00:39 np0005604791 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb  2 04:00:39 np0005604791 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb  2 04:00:39 np0005604791 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb  2 04:00:39 np0005604791 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb  2 04:00:39 np0005604791 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb  2 04:00:39 np0005604791 kernel: iommu: Default domain type: Translated
Feb  2 04:00:39 np0005604791 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb  2 04:00:39 np0005604791 kernel: SCSI subsystem initialized
Feb  2 04:00:39 np0005604791 kernel: ACPI: bus type USB registered
Feb  2 04:00:39 np0005604791 kernel: usbcore: registered new interface driver usbfs
Feb  2 04:00:39 np0005604791 kernel: usbcore: registered new interface driver hub
Feb  2 04:00:39 np0005604791 kernel: usbcore: registered new device driver usb
Feb  2 04:00:39 np0005604791 kernel: pps_core: LinuxPPS API ver. 1 registered
Feb  2 04:00:39 np0005604791 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb  2 04:00:39 np0005604791 kernel: PTP clock support registered
Feb  2 04:00:39 np0005604791 kernel: EDAC MC: Ver: 3.0.0
Feb  2 04:00:39 np0005604791 kernel: NetLabel: Initializing
Feb  2 04:00:39 np0005604791 kernel: NetLabel:  domain hash size = 128
Feb  2 04:00:39 np0005604791 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb  2 04:00:39 np0005604791 kernel: NetLabel:  unlabeled traffic allowed by default
Feb  2 04:00:39 np0005604791 kernel: PCI: Using ACPI for IRQ routing
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb  2 04:00:39 np0005604791 kernel: vgaarb: loaded
Feb  2 04:00:39 np0005604791 kernel: clocksource: Switched to clocksource kvm-clock
Feb  2 04:00:39 np0005604791 kernel: VFS: Disk quotas dquot_6.6.0
Feb  2 04:00:39 np0005604791 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb  2 04:00:39 np0005604791 kernel: pnp: PnP ACPI init
Feb  2 04:00:39 np0005604791 kernel: pnp: PnP ACPI: found 5 devices
Feb  2 04:00:39 np0005604791 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb  2 04:00:39 np0005604791 kernel: NET: Registered PF_INET protocol family
Feb  2 04:00:39 np0005604791 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Feb  2 04:00:39 np0005604791 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb  2 04:00:39 np0005604791 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb  2 04:00:39 np0005604791 kernel: NET: Registered PF_XDP protocol family
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb  2 04:00:39 np0005604791 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb  2 04:00:39 np0005604791 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb  2 04:00:39 np0005604791 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 39780 usecs
Feb  2 04:00:39 np0005604791 kernel: PCI: CLS 0 bytes, default 64
Feb  2 04:00:39 np0005604791 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb  2 04:00:39 np0005604791 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb  2 04:00:39 np0005604791 kernel: ACPI: bus type thunderbolt registered
Feb  2 04:00:39 np0005604791 kernel: Trying to unpack rootfs image as initramfs...
Feb  2 04:00:39 np0005604791 kernel: Initialise system trusted keyrings
Feb  2 04:00:39 np0005604791 kernel: Key type blacklist registered
Feb  2 04:00:39 np0005604791 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Feb  2 04:00:39 np0005604791 kernel: zbud: loaded
Feb  2 04:00:39 np0005604791 kernel: integrity: Platform Keyring initialized
Feb  2 04:00:39 np0005604791 kernel: integrity: Machine keyring initialized
Feb  2 04:00:39 np0005604791 kernel: Freeing initrd memory: 88000K
Feb  2 04:00:39 np0005604791 kernel: NET: Registered PF_ALG protocol family
Feb  2 04:00:39 np0005604791 kernel: xor: automatically using best checksumming function   avx       
Feb  2 04:00:39 np0005604791 kernel: Key type asymmetric registered
Feb  2 04:00:39 np0005604791 kernel: Asymmetric key parser 'x509' registered
Feb  2 04:00:39 np0005604791 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb  2 04:00:39 np0005604791 kernel: io scheduler mq-deadline registered
Feb  2 04:00:39 np0005604791 kernel: io scheduler kyber registered
Feb  2 04:00:39 np0005604791 kernel: io scheduler bfq registered
Feb  2 04:00:39 np0005604791 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb  2 04:00:39 np0005604791 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb  2 04:00:39 np0005604791 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb  2 04:00:39 np0005604791 kernel: ACPI: button: Power Button [PWRF]
Feb  2 04:00:39 np0005604791 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb  2 04:00:39 np0005604791 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb  2 04:00:39 np0005604791 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb  2 04:00:39 np0005604791 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb  2 04:00:39 np0005604791 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb  2 04:00:39 np0005604791 kernel: Non-volatile memory driver v1.3
Feb  2 04:00:39 np0005604791 kernel: rdac: device handler registered
Feb  2 04:00:39 np0005604791 kernel: hp_sw: device handler registered
Feb  2 04:00:39 np0005604791 kernel: emc: device handler registered
Feb  2 04:00:39 np0005604791 kernel: alua: device handler registered
Feb  2 04:00:39 np0005604791 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb  2 04:00:39 np0005604791 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb  2 04:00:39 np0005604791 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb  2 04:00:39 np0005604791 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb  2 04:00:39 np0005604791 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb  2 04:00:39 np0005604791 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb  2 04:00:39 np0005604791 kernel: usb usb1: Product: UHCI Host Controller
Feb  2 04:00:39 np0005604791 kernel: usb usb1: Manufacturer: Linux 5.14.0-665.el9.x86_64 uhci_hcd
Feb  2 04:00:39 np0005604791 kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb  2 04:00:39 np0005604791 kernel: hub 1-0:1.0: USB hub found
Feb  2 04:00:39 np0005604791 kernel: hub 1-0:1.0: 2 ports detected
Feb  2 04:00:39 np0005604791 kernel: usbcore: registered new interface driver usbserial_generic
Feb  2 04:00:39 np0005604791 kernel: usbserial: USB Serial support registered for generic
Feb  2 04:00:39 np0005604791 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb  2 04:00:39 np0005604791 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb  2 04:00:39 np0005604791 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb  2 04:00:39 np0005604791 kernel: mousedev: PS/2 mouse device common for all mice
Feb  2 04:00:39 np0005604791 kernel: rtc_cmos 00:04: RTC can wake from S4
Feb  2 04:00:39 np0005604791 kernel: rtc_cmos 00:04: registered as rtc0
Feb  2 04:00:39 np0005604791 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb  2 04:00:39 np0005604791 kernel: rtc_cmos 00:04: setting system clock to 2026-02-02T09:00:38 UTC (1770022838)
Feb  2 04:00:39 np0005604791 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb  2 04:00:39 np0005604791 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Feb  2 04:00:39 np0005604791 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb  2 04:00:39 np0005604791 kernel: hid: raw HID events driver (C) Jiri Kosina
Feb  2 04:00:39 np0005604791 kernel: usbcore: registered new interface driver usbhid
Feb  2 04:00:39 np0005604791 kernel: usbhid: USB HID core driver
Feb  2 04:00:39 np0005604791 kernel: drop_monitor: Initializing network drop monitor service
Feb  2 04:00:39 np0005604791 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb  2 04:00:39 np0005604791 kernel: Initializing XFRM netlink socket
Feb  2 04:00:39 np0005604791 kernel: NET: Registered PF_INET6 protocol family
Feb  2 04:00:39 np0005604791 kernel: Segment Routing with IPv6
Feb  2 04:00:39 np0005604791 kernel: NET: Registered PF_PACKET protocol family
Feb  2 04:00:39 np0005604791 kernel: mpls_gso: MPLS GSO support
Feb  2 04:00:39 np0005604791 kernel: IPI shorthand broadcast: enabled
Feb  2 04:00:39 np0005604791 kernel: AVX2 version of gcm_enc/dec engaged.
Feb  2 04:00:39 np0005604791 kernel: AES CTR mode by8 optimization enabled
Feb  2 04:00:39 np0005604791 kernel: sched_clock: Marking stable (899001730, 144895020)->(1138477630, -94580880)
Feb  2 04:00:39 np0005604791 kernel: registered taskstats version 1
Feb  2 04:00:39 np0005604791 kernel: Loading compiled-in X.509 certificates
Feb  2 04:00:39 np0005604791 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Feb  2 04:00:39 np0005604791 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb  2 04:00:39 np0005604791 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb  2 04:00:39 np0005604791 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Feb  2 04:00:39 np0005604791 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Feb  2 04:00:39 np0005604791 kernel: Demotion targets for Node 0: null
Feb  2 04:00:39 np0005604791 kernel: page_owner is disabled
Feb  2 04:00:39 np0005604791 kernel: Key type .fscrypt registered
Feb  2 04:00:39 np0005604791 kernel: Key type fscrypt-provisioning registered
Feb  2 04:00:39 np0005604791 kernel: Key type big_key registered
Feb  2 04:00:39 np0005604791 kernel: Key type encrypted registered
Feb  2 04:00:39 np0005604791 kernel: ima: No TPM chip found, activating TPM-bypass!
Feb  2 04:00:39 np0005604791 kernel: Loading compiled-in module X.509 certificates
Feb  2 04:00:39 np0005604791 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Feb  2 04:00:39 np0005604791 kernel: ima: Allocated hash algorithm: sha256
Feb  2 04:00:39 np0005604791 kernel: ima: No architecture policies found
Feb  2 04:00:39 np0005604791 kernel: evm: Initialising EVM extended attributes:
Feb  2 04:00:39 np0005604791 kernel: evm: security.selinux
Feb  2 04:00:39 np0005604791 kernel: evm: security.SMACK64 (disabled)
Feb  2 04:00:39 np0005604791 kernel: evm: security.SMACK64EXEC (disabled)
Feb  2 04:00:39 np0005604791 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb  2 04:00:39 np0005604791 kernel: evm: security.SMACK64MMAP (disabled)
Feb  2 04:00:39 np0005604791 kernel: evm: security.apparmor (disabled)
Feb  2 04:00:39 np0005604791 kernel: evm: security.ima
Feb  2 04:00:39 np0005604791 kernel: evm: security.capability
Feb  2 04:00:39 np0005604791 kernel: evm: HMAC attrs: 0x1
Feb  2 04:00:39 np0005604791 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb  2 04:00:39 np0005604791 kernel: Running certificate verification RSA selftest
Feb  2 04:00:39 np0005604791 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb  2 04:00:39 np0005604791 kernel: Running certificate verification ECDSA selftest
Feb  2 04:00:39 np0005604791 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Feb  2 04:00:39 np0005604791 kernel: clk: Disabling unused clocks
Feb  2 04:00:39 np0005604791 kernel: Freeing unused decrypted memory: 2028K
Feb  2 04:00:39 np0005604791 kernel: Freeing unused kernel image (initmem) memory: 4196K
Feb  2 04:00:39 np0005604791 kernel: Write protecting the kernel read-only data: 30720k
Feb  2 04:00:39 np0005604791 kernel: Freeing unused kernel image (rodata/data gap) memory: 408K
Feb  2 04:00:39 np0005604791 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb  2 04:00:39 np0005604791 kernel: Run /init as init process
Feb  2 04:00:39 np0005604791 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb  2 04:00:39 np0005604791 systemd: Detected virtualization kvm.
Feb  2 04:00:39 np0005604791 systemd: Detected architecture x86-64.
Feb  2 04:00:39 np0005604791 systemd: Running in initrd.
Feb  2 04:00:39 np0005604791 systemd: No hostname configured, using default hostname.
Feb  2 04:00:39 np0005604791 systemd: Hostname set to <localhost>.
Feb  2 04:00:39 np0005604791 systemd: Initializing machine ID from VM UUID.
Feb  2 04:00:39 np0005604791 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb  2 04:00:39 np0005604791 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb  2 04:00:39 np0005604791 kernel: usb 1-1: Product: QEMU USB Tablet
Feb  2 04:00:39 np0005604791 kernel: usb 1-1: Manufacturer: QEMU
Feb  2 04:00:39 np0005604791 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb  2 04:00:39 np0005604791 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb  2 04:00:39 np0005604791 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb  2 04:00:39 np0005604791 systemd: Queued start job for default target Initrd Default Target.
Feb  2 04:00:39 np0005604791 systemd: Started Dispatch Password Requests to Console Directory Watch.
Feb  2 04:00:39 np0005604791 systemd: Reached target Local Encrypted Volumes.
Feb  2 04:00:39 np0005604791 systemd: Reached target Initrd /usr File System.
Feb  2 04:00:39 np0005604791 systemd: Reached target Local File Systems.
Feb  2 04:00:39 np0005604791 systemd: Reached target Path Units.
Feb  2 04:00:39 np0005604791 systemd: Reached target Slice Units.
Feb  2 04:00:39 np0005604791 systemd: Reached target Swaps.
Feb  2 04:00:39 np0005604791 systemd: Reached target Timer Units.
Feb  2 04:00:39 np0005604791 systemd: Listening on D-Bus System Message Bus Socket.
Feb  2 04:00:39 np0005604791 systemd: Listening on Journal Socket (/dev/log).
Feb  2 04:00:39 np0005604791 systemd: Listening on Journal Socket.
Feb  2 04:00:39 np0005604791 systemd: Listening on udev Control Socket.
Feb  2 04:00:39 np0005604791 systemd: Listening on udev Kernel Socket.
Feb  2 04:00:39 np0005604791 systemd: Reached target Socket Units.
Feb  2 04:00:39 np0005604791 systemd: Starting Create List of Static Device Nodes...
Feb  2 04:00:39 np0005604791 systemd: Starting Journal Service...
Feb  2 04:00:39 np0005604791 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb  2 04:00:39 np0005604791 systemd: Starting Apply Kernel Variables...
Feb  2 04:00:39 np0005604791 systemd: Starting Create System Users...
Feb  2 04:00:39 np0005604791 systemd: Starting Setup Virtual Console...
Feb  2 04:00:39 np0005604791 systemd: Finished Create List of Static Device Nodes.
Feb  2 04:00:39 np0005604791 systemd: Finished Apply Kernel Variables.
Feb  2 04:00:39 np0005604791 systemd: Finished Create System Users.
Feb  2 04:00:39 np0005604791 systemd-journald[306]: Journal started
Feb  2 04:00:39 np0005604791 systemd-journald[306]: Runtime Journal (/run/log/journal/7f778d97f318438087762e4d99e5fd86) is 8.0M, max 153.6M, 145.6M free.
Feb  2 04:00:39 np0005604791 systemd-sysusers[310]: Creating group 'users' with GID 100.
Feb  2 04:00:39 np0005604791 systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Feb  2 04:00:39 np0005604791 systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb  2 04:00:39 np0005604791 systemd: Started Journal Service.
Feb  2 04:00:39 np0005604791 systemd[1]: Starting Create Static Device Nodes in /dev...
Feb  2 04:00:39 np0005604791 systemd[1]: Starting Create Volatile Files and Directories...
Feb  2 04:00:39 np0005604791 systemd[1]: Finished Create Static Device Nodes in /dev.
Feb  2 04:00:39 np0005604791 systemd[1]: Finished Create Volatile Files and Directories.
Feb  2 04:00:39 np0005604791 systemd[1]: Finished Setup Virtual Console.
Feb  2 04:00:39 np0005604791 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb  2 04:00:39 np0005604791 systemd[1]: Starting dracut cmdline hook...
Feb  2 04:00:39 np0005604791 dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Feb  2 04:00:39 np0005604791 dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb  2 04:00:39 np0005604791 systemd[1]: Finished dracut cmdline hook.
Feb  2 04:00:39 np0005604791 systemd[1]: Starting dracut pre-udev hook...
Feb  2 04:00:39 np0005604791 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb  2 04:00:39 np0005604791 kernel: device-mapper: uevent: version 1.0.3
Feb  2 04:00:39 np0005604791 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Feb  2 04:00:39 np0005604791 kernel: RPC: Registered named UNIX socket transport module.
Feb  2 04:00:39 np0005604791 kernel: RPC: Registered udp transport module.
Feb  2 04:00:39 np0005604791 kernel: RPC: Registered tcp transport module.
Feb  2 04:00:39 np0005604791 kernel: RPC: Registered tcp-with-tls transport module.
Feb  2 04:00:39 np0005604791 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb  2 04:00:39 np0005604791 rpc.statd[442]: Version 2.5.4 starting
Feb  2 04:00:39 np0005604791 rpc.statd[442]: Initializing NSM state
Feb  2 04:00:39 np0005604791 rpc.idmapd[447]: Setting log level to 0
Feb  2 04:00:39 np0005604791 systemd[1]: Finished dracut pre-udev hook.
Feb  2 04:00:39 np0005604791 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb  2 04:00:39 np0005604791 systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Feb  2 04:00:39 np0005604791 systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb  2 04:00:39 np0005604791 systemd[1]: Starting dracut pre-trigger hook...
Feb  2 04:00:39 np0005604791 systemd[1]: Finished dracut pre-trigger hook.
Feb  2 04:00:39 np0005604791 systemd[1]: Starting Coldplug All udev Devices...
Feb  2 04:00:39 np0005604791 systemd[1]: Created slice Slice /system/modprobe.
Feb  2 04:00:39 np0005604791 systemd[1]: Starting Load Kernel Module configfs...
Feb  2 04:00:39 np0005604791 systemd[1]: Finished Coldplug All udev Devices.
Feb  2 04:00:39 np0005604791 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb  2 04:00:39 np0005604791 systemd[1]: Finished Load Kernel Module configfs.
Feb  2 04:00:39 np0005604791 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb  2 04:00:39 np0005604791 systemd[1]: Reached target Network.
Feb  2 04:00:39 np0005604791 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb  2 04:00:39 np0005604791 systemd[1]: Starting dracut initqueue hook...
Feb  2 04:00:39 np0005604791 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Feb  2 04:00:39 np0005604791 kernel: scsi host0: ata_piix
Feb  2 04:00:39 np0005604791 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Feb  2 04:00:39 np0005604791 kernel: scsi host1: ata_piix
Feb  2 04:00:39 np0005604791 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Feb  2 04:00:39 np0005604791 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Feb  2 04:00:39 np0005604791 systemd-udevd[496]: Network interface NamePolicy= disabled on kernel command line.
Feb  2 04:00:39 np0005604791 kernel: vda: vda1
Feb  2 04:00:40 np0005604791 systemd[1]: Mounting Kernel Configuration File System...
Feb  2 04:00:40 np0005604791 kernel: ata1: found unknown device (class 0)
Feb  2 04:00:40 np0005604791 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb  2 04:00:40 np0005604791 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb  2 04:00:40 np0005604791 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb  2 04:00:40 np0005604791 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb  2 04:00:40 np0005604791 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb  2 04:00:40 np0005604791 systemd[1]: Mounted Kernel Configuration File System.
Feb  2 04:00:40 np0005604791 systemd[1]: Found device /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Feb  2 04:00:40 np0005604791 systemd[1]: Reached target Initrd Root Device.
Feb  2 04:00:40 np0005604791 systemd[1]: Reached target System Initialization.
Feb  2 04:00:40 np0005604791 systemd[1]: Reached target Basic System.
Feb  2 04:00:40 np0005604791 systemd[1]: Finished dracut initqueue hook.
Feb  2 04:00:40 np0005604791 systemd[1]: Reached target Preparation for Remote File Systems.
Feb  2 04:00:40 np0005604791 systemd[1]: Reached target Remote Encrypted Volumes.
Feb  2 04:00:40 np0005604791 systemd[1]: Reached target Remote File Systems.
Feb  2 04:00:40 np0005604791 systemd[1]: Starting dracut pre-mount hook...
Feb  2 04:00:40 np0005604791 systemd[1]: Finished dracut pre-mount hook.
Feb  2 04:00:40 np0005604791 systemd[1]: Starting File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8...
Feb  2 04:00:40 np0005604791 systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Feb  2 04:00:40 np0005604791 systemd[1]: Finished File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Feb  2 04:00:40 np0005604791 systemd[1]: Mounting /sysroot...
Feb  2 04:00:40 np0005604791 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb  2 04:00:40 np0005604791 kernel: XFS (vda1): Mounting V5 Filesystem 822f14ea-6e7e-41df-b0d8-fbe282d9ded8
Feb  2 04:00:40 np0005604791 kernel: XFS (vda1): Ending clean mount
Feb  2 04:00:40 np0005604791 systemd[1]: Mounted /sysroot.
Feb  2 04:00:40 np0005604791 systemd[1]: Reached target Initrd Root File System.
Feb  2 04:00:40 np0005604791 systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb  2 04:00:40 np0005604791 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb  2 04:00:40 np0005604791 systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb  2 04:00:40 np0005604791 systemd[1]: Reached target Initrd File Systems.
Feb  2 04:00:40 np0005604791 systemd[1]: Reached target Initrd Default Target.
Feb  2 04:00:40 np0005604791 systemd[1]: Starting dracut mount hook...
Feb  2 04:00:41 np0005604791 systemd[1]: Finished dracut mount hook.
Feb  2 04:00:41 np0005604791 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb  2 04:00:41 np0005604791 rpc.idmapd[447]: exiting on signal 15
Feb  2 04:00:41 np0005604791 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb  2 04:00:41 np0005604791 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Network.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Remote Encrypted Volumes.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Timer Units.
Feb  2 04:00:41 np0005604791 systemd[1]: dbus.socket: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Closed D-Bus System Message Bus Socket.
Feb  2 04:00:41 np0005604791 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Initrd Default Target.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Basic System.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Initrd Root Device.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Initrd /usr File System.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Path Units.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Remote File Systems.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Preparation for Remote File Systems.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Slice Units.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Socket Units.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target System Initialization.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Local File Systems.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Swaps.
Feb  2 04:00:41 np0005604791 systemd[1]: dracut-mount.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped dracut mount hook.
Feb  2 04:00:41 np0005604791 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped dracut pre-mount hook.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped target Local Encrypted Volumes.
Feb  2 04:00:41 np0005604791 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb  2 04:00:41 np0005604791 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped dracut initqueue hook.
Feb  2 04:00:41 np0005604791 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped Apply Kernel Variables.
Feb  2 04:00:41 np0005604791 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped Create Volatile Files and Directories.
Feb  2 04:00:41 np0005604791 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped Coldplug All udev Devices.
Feb  2 04:00:41 np0005604791 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped dracut pre-trigger hook.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb  2 04:00:41 np0005604791 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped Setup Virtual Console.
Feb  2 04:00:41 np0005604791 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb  2 04:00:41 np0005604791 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Closed udev Control Socket.
Feb  2 04:00:41 np0005604791 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Closed udev Kernel Socket.
Feb  2 04:00:41 np0005604791 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped dracut pre-udev hook.
Feb  2 04:00:41 np0005604791 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped dracut cmdline hook.
Feb  2 04:00:41 np0005604791 systemd[1]: Starting Cleanup udev Database...
Feb  2 04:00:41 np0005604791 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb  2 04:00:41 np0005604791 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped Create List of Static Device Nodes.
Feb  2 04:00:41 np0005604791 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Stopped Create System Users.
Feb  2 04:00:41 np0005604791 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb  2 04:00:41 np0005604791 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb  2 04:00:41 np0005604791 systemd[1]: Finished Cleanup udev Database.
Feb  2 04:00:41 np0005604791 systemd[1]: Reached target Switch Root.
Feb  2 04:00:41 np0005604791 systemd[1]: Starting Switch Root...
Feb  2 04:00:41 np0005604791 systemd[1]: Switching root.
Feb  2 04:00:41 np0005604791 systemd-journald[306]: Journal stopped
Feb  2 04:00:42 np0005604791 systemd-journald: Received SIGTERM from PID 1 (systemd).
Feb  2 04:00:42 np0005604791 kernel: audit: type=1404 audit(1770022841.496:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb  2 04:00:42 np0005604791 kernel: SELinux:  policy capability network_peer_controls=1
Feb  2 04:00:42 np0005604791 kernel: SELinux:  policy capability open_perms=1
Feb  2 04:00:42 np0005604791 kernel: SELinux:  policy capability extended_socket_class=1
Feb  2 04:00:42 np0005604791 kernel: SELinux:  policy capability always_check_network=0
Feb  2 04:00:42 np0005604791 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb  2 04:00:42 np0005604791 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb  2 04:00:42 np0005604791 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb  2 04:00:42 np0005604791 kernel: audit: type=1403 audit(1770022841.616:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb  2 04:00:42 np0005604791 systemd: Successfully loaded SELinux policy in 123.354ms.
Feb  2 04:00:42 np0005604791 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 25.796ms.
Feb  2 04:00:42 np0005604791 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb  2 04:00:42 np0005604791 systemd: Detected virtualization kvm.
Feb  2 04:00:42 np0005604791 systemd: Detected architecture x86-64.
Feb  2 04:00:42 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:00:42 np0005604791 systemd: initrd-switch-root.service: Deactivated successfully.
Feb  2 04:00:42 np0005604791 systemd: Stopped Switch Root.
Feb  2 04:00:42 np0005604791 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb  2 04:00:42 np0005604791 systemd: Created slice Slice /system/getty.
Feb  2 04:00:42 np0005604791 systemd: Created slice Slice /system/serial-getty.
Feb  2 04:00:42 np0005604791 systemd: Created slice Slice /system/sshd-keygen.
Feb  2 04:00:42 np0005604791 systemd: Created slice User and Session Slice.
Feb  2 04:00:42 np0005604791 systemd: Started Dispatch Password Requests to Console Directory Watch.
Feb  2 04:00:42 np0005604791 systemd: Started Forward Password Requests to Wall Directory Watch.
Feb  2 04:00:42 np0005604791 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb  2 04:00:42 np0005604791 systemd: Reached target Local Encrypted Volumes.
Feb  2 04:00:42 np0005604791 systemd: Stopped target Switch Root.
Feb  2 04:00:42 np0005604791 systemd: Stopped target Initrd File Systems.
Feb  2 04:00:42 np0005604791 systemd: Stopped target Initrd Root File System.
Feb  2 04:00:42 np0005604791 systemd: Reached target Local Integrity Protected Volumes.
Feb  2 04:00:42 np0005604791 systemd: Reached target Path Units.
Feb  2 04:00:42 np0005604791 systemd: Reached target rpc_pipefs.target.
Feb  2 04:00:42 np0005604791 systemd: Reached target Slice Units.
Feb  2 04:00:42 np0005604791 systemd: Reached target Swaps.
Feb  2 04:00:42 np0005604791 systemd: Reached target Local Verity Protected Volumes.
Feb  2 04:00:42 np0005604791 systemd: Listening on RPCbind Server Activation Socket.
Feb  2 04:00:42 np0005604791 systemd: Reached target RPC Port Mapper.
Feb  2 04:00:42 np0005604791 systemd: Listening on Process Core Dump Socket.
Feb  2 04:00:42 np0005604791 systemd: Listening on initctl Compatibility Named Pipe.
Feb  2 04:00:42 np0005604791 systemd: Listening on udev Control Socket.
Feb  2 04:00:42 np0005604791 systemd: Listening on udev Kernel Socket.
Feb  2 04:00:42 np0005604791 systemd: Mounting Huge Pages File System...
Feb  2 04:00:42 np0005604791 systemd: Mounting POSIX Message Queue File System...
Feb  2 04:00:42 np0005604791 systemd: Mounting Kernel Debug File System...
Feb  2 04:00:42 np0005604791 systemd: Mounting Kernel Trace File System...
Feb  2 04:00:42 np0005604791 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb  2 04:00:42 np0005604791 systemd: Starting Create List of Static Device Nodes...
Feb  2 04:00:42 np0005604791 systemd: Starting Load Kernel Module configfs...
Feb  2 04:00:42 np0005604791 systemd: Starting Load Kernel Module drm...
Feb  2 04:00:42 np0005604791 systemd: Starting Load Kernel Module efi_pstore...
Feb  2 04:00:42 np0005604791 systemd: Starting Load Kernel Module fuse...
Feb  2 04:00:42 np0005604791 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb  2 04:00:42 np0005604791 systemd: systemd-fsck-root.service: Deactivated successfully.
Feb  2 04:00:42 np0005604791 systemd: Stopped File System Check on Root Device.
Feb  2 04:00:42 np0005604791 systemd: Stopped Journal Service.
Feb  2 04:00:42 np0005604791 systemd: Starting Journal Service...
Feb  2 04:00:42 np0005604791 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb  2 04:00:42 np0005604791 systemd: Starting Generate network units from Kernel command line...
Feb  2 04:00:42 np0005604791 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb  2 04:00:42 np0005604791 systemd: Starting Remount Root and Kernel File Systems...
Feb  2 04:00:42 np0005604791 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb  2 04:00:42 np0005604791 systemd: Starting Apply Kernel Variables...
Feb  2 04:00:42 np0005604791 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb  2 04:00:42 np0005604791 systemd: Starting Coldplug All udev Devices...
Feb  2 04:00:42 np0005604791 kernel: fuse: init (API version 7.37)
Feb  2 04:00:42 np0005604791 systemd: Mounted Huge Pages File System.
Feb  2 04:00:42 np0005604791 systemd: Mounted POSIX Message Queue File System.
Feb  2 04:00:42 np0005604791 systemd: Mounted Kernel Debug File System.
Feb  2 04:00:42 np0005604791 systemd: Mounted Kernel Trace File System.
Feb  2 04:00:42 np0005604791 systemd: Finished Create List of Static Device Nodes.
Feb  2 04:00:42 np0005604791 systemd-journald[680]: Journal started
Feb  2 04:00:42 np0005604791 systemd-journald[680]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Feb  2 04:00:42 np0005604791 systemd: modprobe@configfs.service: Deactivated successfully.
Feb  2 04:00:42 np0005604791 systemd[1]: Queued start job for default target Multi-User System.
Feb  2 04:00:42 np0005604791 systemd[1]: systemd-journald.service: Deactivated successfully.
Feb  2 04:00:42 np0005604791 systemd: Finished Load Kernel Module configfs.
Feb  2 04:00:42 np0005604791 systemd: Started Journal Service.
Feb  2 04:00:42 np0005604791 kernel: ACPI: bus type drm_connector registered
Feb  2 04:00:42 np0005604791 systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Load Kernel Module drm.
Feb  2 04:00:42 np0005604791 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Load Kernel Module efi_pstore.
Feb  2 04:00:42 np0005604791 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Load Kernel Module fuse.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Generate network units from Kernel command line.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Remount Root and Kernel File Systems.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Apply Kernel Variables.
Feb  2 04:00:42 np0005604791 systemd[1]: Mounting FUSE Control File System...
Feb  2 04:00:42 np0005604791 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Rebuild Hardware Database...
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Flush Journal to Persistent Storage...
Feb  2 04:00:42 np0005604791 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Load/Save OS Random Seed...
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Create System Users...
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Coldplug All udev Devices.
Feb  2 04:00:42 np0005604791 systemd[1]: Mounted FUSE Control File System.
Feb  2 04:00:42 np0005604791 systemd-journald[680]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Feb  2 04:00:42 np0005604791 systemd-journald[680]: Received client request to flush runtime journal.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Flush Journal to Persistent Storage.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Load/Save OS Random Seed.
Feb  2 04:00:42 np0005604791 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Create System Users.
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Create Static Device Nodes in /dev...
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Create Static Device Nodes in /dev.
Feb  2 04:00:42 np0005604791 systemd[1]: Reached target Preparation for Local File Systems.
Feb  2 04:00:42 np0005604791 systemd[1]: Reached target Local File Systems.
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb  2 04:00:42 np0005604791 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb  2 04:00:42 np0005604791 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb  2 04:00:42 np0005604791 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Automatic Boot Loader Update...
Feb  2 04:00:42 np0005604791 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Create Volatile Files and Directories...
Feb  2 04:00:42 np0005604791 bootctl[700]: Couldn't find EFI system partition, skipping.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Automatic Boot Loader Update.
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Create Volatile Files and Directories.
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Security Auditing Service...
Feb  2 04:00:42 np0005604791 systemd[1]: Starting RPC Bind...
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Rebuild Journal Catalog...
Feb  2 04:00:42 np0005604791 auditd[706]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Feb  2 04:00:42 np0005604791 auditd[706]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Rebuild Journal Catalog.
Feb  2 04:00:42 np0005604791 systemd[1]: Started RPC Bind.
Feb  2 04:00:42 np0005604791 augenrules[711]: /sbin/augenrules: No change
Feb  2 04:00:42 np0005604791 augenrules[726]: No rules
Feb  2 04:00:42 np0005604791 augenrules[726]: enabled 1
Feb  2 04:00:42 np0005604791 augenrules[726]: failure 1
Feb  2 04:00:42 np0005604791 augenrules[726]: pid 706
Feb  2 04:00:42 np0005604791 augenrules[726]: rate_limit 0
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog_limit 8192
Feb  2 04:00:42 np0005604791 augenrules[726]: lost 0
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog 3
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog_wait_time 60000
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog_wait_time_actual 0
Feb  2 04:00:42 np0005604791 augenrules[726]: enabled 1
Feb  2 04:00:42 np0005604791 augenrules[726]: failure 1
Feb  2 04:00:42 np0005604791 augenrules[726]: pid 706
Feb  2 04:00:42 np0005604791 augenrules[726]: rate_limit 0
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog_limit 8192
Feb  2 04:00:42 np0005604791 augenrules[726]: lost 0
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog 2
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog_wait_time 60000
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog_wait_time_actual 0
Feb  2 04:00:42 np0005604791 augenrules[726]: enabled 1
Feb  2 04:00:42 np0005604791 augenrules[726]: failure 1
Feb  2 04:00:42 np0005604791 augenrules[726]: pid 706
Feb  2 04:00:42 np0005604791 augenrules[726]: rate_limit 0
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog_limit 8192
Feb  2 04:00:42 np0005604791 augenrules[726]: lost 0
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog 0
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog_wait_time 60000
Feb  2 04:00:42 np0005604791 augenrules[726]: backlog_wait_time_actual 0
Feb  2 04:00:42 np0005604791 systemd[1]: Started Security Auditing Service.
Feb  2 04:00:42 np0005604791 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb  2 04:00:42 np0005604791 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb  2 04:00:43 np0005604791 systemd[1]: Finished Rebuild Hardware Database.
Feb  2 04:00:43 np0005604791 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb  2 04:00:43 np0005604791 systemd-udevd[734]: Using default interface naming scheme 'rhel-9.0'.
Feb  2 04:00:43 np0005604791 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb  2 04:00:43 np0005604791 systemd[1]: Starting Update is Completed...
Feb  2 04:00:43 np0005604791 systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb  2 04:00:43 np0005604791 systemd[1]: Starting Load Kernel Module configfs...
Feb  2 04:00:43 np0005604791 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb  2 04:00:43 np0005604791 systemd[1]: Finished Load Kernel Module configfs.
Feb  2 04:00:43 np0005604791 systemd[1]: Finished Update is Completed.
Feb  2 04:00:43 np0005604791 systemd[1]: Reached target System Initialization.
Feb  2 04:00:43 np0005604791 systemd[1]: Started dnf makecache --timer.
Feb  2 04:00:43 np0005604791 systemd[1]: Started Daily rotation of log files.
Feb  2 04:00:43 np0005604791 systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb  2 04:00:43 np0005604791 systemd[1]: Reached target Timer Units.
Feb  2 04:00:43 np0005604791 systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb  2 04:00:43 np0005604791 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb  2 04:00:43 np0005604791 systemd[1]: Reached target Socket Units.
Feb  2 04:00:43 np0005604791 systemd[1]: Starting D-Bus System Message Bus...
Feb  2 04:00:43 np0005604791 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb  2 04:00:43 np0005604791 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb  2 04:00:43 np0005604791 systemd-udevd[741]: Network interface NamePolicy= disabled on kernel command line.
Feb  2 04:00:43 np0005604791 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb  2 04:00:43 np0005604791 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Feb  2 04:00:43 np0005604791 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Feb  2 04:00:43 np0005604791 systemd[1]: Started D-Bus System Message Bus.
Feb  2 04:00:43 np0005604791 systemd[1]: Reached target Basic System.
Feb  2 04:00:43 np0005604791 dbus-broker-lau[775]: Ready
Feb  2 04:00:43 np0005604791 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb  2 04:00:43 np0005604791 systemd[1]: Starting NTP client/server...
Feb  2 04:00:43 np0005604791 kernel: kvm_amd: TSC scaling supported
Feb  2 04:00:43 np0005604791 kernel: kvm_amd: Nested Virtualization enabled
Feb  2 04:00:43 np0005604791 kernel: kvm_amd: Nested Paging enabled
Feb  2 04:00:43 np0005604791 kernel: kvm_amd: LBR virtualization supported
Feb  2 04:00:43 np0005604791 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Feb  2 04:00:43 np0005604791 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb  2 04:00:43 np0005604791 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb  2 04:00:43 np0005604791 kernel: Console: switching to colour dummy device 80x25
Feb  2 04:00:43 np0005604791 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb  2 04:00:43 np0005604791 kernel: [drm] features: -context_init
Feb  2 04:00:43 np0005604791 kernel: [drm] number of scanouts: 1
Feb  2 04:00:43 np0005604791 kernel: [drm] number of cap sets: 0
Feb  2 04:00:43 np0005604791 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Feb  2 04:00:43 np0005604791 systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb  2 04:00:43 np0005604791 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb  2 04:00:43 np0005604791 kernel: Console: switching to colour frame buffer device 128x48
Feb  2 04:00:43 np0005604791 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb  2 04:00:43 np0005604791 systemd[1]: Starting IPv4 firewall with iptables...
Feb  2 04:00:43 np0005604791 systemd[1]: Started irqbalance daemon.
Feb  2 04:00:43 np0005604791 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb  2 04:00:43 np0005604791 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb  2 04:00:43 np0005604791 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb  2 04:00:43 np0005604791 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb  2 04:00:43 np0005604791 systemd[1]: Reached target sshd-keygen.target.
Feb  2 04:00:43 np0005604791 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb  2 04:00:43 np0005604791 systemd[1]: Reached target User and Group Name Lookups.
Feb  2 04:00:43 np0005604791 systemd[1]: Starting User Login Management...
Feb  2 04:00:43 np0005604791 systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb  2 04:00:43 np0005604791 chronyd[811]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb  2 04:00:43 np0005604791 chronyd[811]: Loaded 0 symmetric keys
Feb  2 04:00:43 np0005604791 chronyd[811]: Using right/UTC timezone to obtain leap second data
Feb  2 04:00:43 np0005604791 chronyd[811]: Loaded seccomp filter (level 2)
Feb  2 04:00:43 np0005604791 systemd[1]: Started NTP client/server.
Feb  2 04:00:43 np0005604791 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb  2 04:00:43 np0005604791 systemd-logind[805]: New seat seat0.
Feb  2 04:00:43 np0005604791 systemd-logind[805]: Watching system buttons on /dev/input/event0 (Power Button)
Feb  2 04:00:43 np0005604791 systemd-logind[805]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb  2 04:00:43 np0005604791 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Feb  2 04:00:43 np0005604791 systemd[1]: Started User Login Management.
Feb  2 04:00:43 np0005604791 iptables.init[799]: iptables: Applying firewall rules: [  OK  ]
Feb  2 04:00:43 np0005604791 systemd[1]: Finished IPv4 firewall with iptables.
Feb  2 04:00:44 np0005604791 cloud-init[844]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 02 Feb 2026 09:00:44 +0000. Up 6.51 seconds.
Feb  2 04:00:44 np0005604791 systemd[1]: run-cloud\x2dinit-tmp-tmpimibsutq.mount: Deactivated successfully.
Feb  2 04:00:44 np0005604791 systemd[1]: Starting Hostname Service...
Feb  2 04:00:44 np0005604791 systemd[1]: Started Hostname Service.
Feb  2 04:00:44 np0005604791 systemd-hostnamed[858]: Hostname set to <np0005604791.novalocal> (static)
Feb  2 04:00:44 np0005604791 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Feb  2 04:00:44 np0005604791 systemd[1]: Reached target Preparation for Network.
Feb  2 04:00:44 np0005604791 systemd[1]: Starting Network Manager...
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.7656] NetworkManager (version 1.54.3-2.el9) is starting... (boot:73bfa7d4-cc72-468c-831e-edc1e8589b87)
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.7662] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.7792] manager[0x55c11e6e0000]: monitoring kernel firmware directory '/lib/firmware'.
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.7827] hostname: hostname: using hostnamed
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.7828] hostname: static hostname changed from (none) to "np0005604791.novalocal"
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.7830] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.7952] manager[0x55c11e6e0000]: rfkill: Wi-Fi hardware radio set enabled
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.7952] manager[0x55c11e6e0000]: rfkill: WWAN hardware radio set enabled
Feb  2 04:00:44 np0005604791 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8045] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8046] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8047] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8048] manager: Networking is enabled by state file
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8050] settings: Loaded settings plugin: keyfile (internal)
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8100] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb  2 04:00:44 np0005604791 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8209] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8441] dhcp: init: Using DHCP client 'internal'
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8449] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb  2 04:00:44 np0005604791 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8484] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8531] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8570] device (lo): Activation: starting connection 'lo' (5d713ff7-af86-4df5-9d5a-ad7ed5dcc84d)
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8583] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8589] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:00:44 np0005604791 systemd[1]: Started Network Manager.
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8626] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8638] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8645] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb  2 04:00:44 np0005604791 systemd[1]: Reached target Network.
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8652] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8659] device (eth0): carrier: link connected
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8664] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8675] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8685] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8695] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8696] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8701] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8709] manager: NetworkManager state is now CONNECTING
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8713] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8722] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8733] device (lo): Activation: successful, device activated.
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8750] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:00:44 np0005604791 NetworkManager[862]: <info>  [1770022844.8758] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb  2 04:00:44 np0005604791 systemd[1]: Starting Network Manager Wait Online...
Feb  2 04:00:44 np0005604791 systemd[1]: Starting GSSAPI Proxy Daemon...
Feb  2 04:00:44 np0005604791 systemd[1]: Started GSSAPI Proxy Daemon.
Feb  2 04:00:44 np0005604791 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb  2 04:00:44 np0005604791 systemd[1]: Reached target NFS client services.
Feb  2 04:00:44 np0005604791 systemd[1]: Reached target Preparation for Remote File Systems.
Feb  2 04:00:44 np0005604791 systemd[1]: Reached target Remote File Systems.
Feb  2 04:00:44 np0005604791 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb  2 04:00:45 np0005604791 NetworkManager[862]: <info>  [1770022845.6162] dhcp4 (eth0): state changed new lease, address=38.102.83.189
Feb  2 04:00:45 np0005604791 NetworkManager[862]: <info>  [1770022845.6180] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb  2 04:00:45 np0005604791 NetworkManager[862]: <info>  [1770022845.6206] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:00:45 np0005604791 NetworkManager[862]: <info>  [1770022845.6232] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:00:45 np0005604791 NetworkManager[862]: <info>  [1770022845.6235] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:00:45 np0005604791 NetworkManager[862]: <info>  [1770022845.6239] manager: NetworkManager state is now CONNECTED_SITE
Feb  2 04:00:45 np0005604791 NetworkManager[862]: <info>  [1770022845.6244] device (eth0): Activation: successful, device activated.
Feb  2 04:00:45 np0005604791 NetworkManager[862]: <info>  [1770022845.6250] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb  2 04:00:45 np0005604791 NetworkManager[862]: <info>  [1770022845.6255] manager: startup complete
Feb  2 04:00:45 np0005604791 systemd[1]: Finished Network Manager Wait Online.
Feb  2 04:00:45 np0005604791 systemd[1]: Starting Cloud-init: Network Stage...
Feb  2 04:00:45 np0005604791 cloud-init[925]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 02 Feb 2026 09:00:45 +0000. Up 8.25 seconds.
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: |  eth0  | True |        38.102.83.189         | 255.255.255.0 | global | fa:16:3e:ac:ea:a6 |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: |  eth0  | True | fe80::f816:3eff:feac:eaa6/64 |       .       |  link  | fa:16:3e:ac:ea:a6 |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb  2 04:00:45 np0005604791 cloud-init[925]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb  2 04:00:46 np0005604791 cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Feb  2 04:00:48 np0005604791 cloud-init[925]: Generating public/private rsa key pair.
Feb  2 04:00:48 np0005604791 cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb  2 04:00:48 np0005604791 cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb  2 04:00:48 np0005604791 cloud-init[925]: The key fingerprint is:
Feb  2 04:00:48 np0005604791 cloud-init[925]: SHA256:SATBxxODRQB/dRXEaifL7F2mZu61vPlj0VS7a3P8BAc root@np0005604791.novalocal
Feb  2 04:00:48 np0005604791 cloud-init[925]: The key's randomart image is:
Feb  2 04:00:48 np0005604791 cloud-init[925]: +---[RSA 3072]----+
Feb  2 04:00:48 np0005604791 cloud-init[925]: |  .o+O*.. .++.   |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |   .o.+o .  .   .|
Feb  2 04:00:48 np0005604791 cloud-init[925]: |    ..o.   .  E o|
Feb  2 04:00:48 np0005604791 cloud-init[925]: |     o .  + .  o.|
Feb  2 04:00:48 np0005604791 cloud-init[925]: |      . S+ +  ..+|
Feb  2 04:00:48 np0005604791 cloud-init[925]: |          +   o=.|
Feb  2 04:00:48 np0005604791 cloud-init[925]: |         . . +..+|
Feb  2 04:00:48 np0005604791 cloud-init[925]: |          . =o Xo|
Feb  2 04:00:48 np0005604791 cloud-init[925]: |           =o BoB|
Feb  2 04:00:48 np0005604791 cloud-init[925]: +----[SHA256]-----+
Feb  2 04:00:48 np0005604791 cloud-init[925]: Generating public/private ecdsa key pair.
Feb  2 04:00:48 np0005604791 cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb  2 04:00:48 np0005604791 cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb  2 04:00:48 np0005604791 cloud-init[925]: The key fingerprint is:
Feb  2 04:00:48 np0005604791 cloud-init[925]: SHA256:3cWqaKHv8gpYx1XjtiGfkeI9XXQDI6cmrbyggix79RE root@np0005604791.novalocal
Feb  2 04:00:48 np0005604791 cloud-init[925]: The key's randomart image is:
Feb  2 04:00:48 np0005604791 cloud-init[925]: +---[ECDSA 256]---+
Feb  2 04:00:48 np0005604791 cloud-init[925]: |          o. +o..|
Feb  2 04:00:48 np0005604791 cloud-init[925]: |         o.o+o...|
Feb  2 04:00:48 np0005604791 cloud-init[925]: |        +.*+  +  |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |     .Eo.B+B +   |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |    . ooSoB +    |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |. .o..o..o.o     |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |.o.o.o..o..      |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |... ..oo         |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |..    .=+        |
Feb  2 04:00:48 np0005604791 cloud-init[925]: +----[SHA256]-----+
Feb  2 04:00:48 np0005604791 cloud-init[925]: Generating public/private ed25519 key pair.
Feb  2 04:00:48 np0005604791 cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb  2 04:00:48 np0005604791 cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb  2 04:00:48 np0005604791 cloud-init[925]: The key fingerprint is:
Feb  2 04:00:48 np0005604791 cloud-init[925]: SHA256:aYizx4EWEpqD0HGvB5mqoP2aGGblYGGFbTG9xBqOYkw root@np0005604791.novalocal
Feb  2 04:00:48 np0005604791 cloud-init[925]: The key's randomart image is:
Feb  2 04:00:48 np0005604791 cloud-init[925]: +--[ED25519 256]--+
Feb  2 04:00:48 np0005604791 cloud-init[925]: |..===            |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |oEo*.B           |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |Bo= O o          |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |o=.= B . .       |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |+o..* + S        |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |+o+. = o         |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |+o... o          |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |oo o .           |
Feb  2 04:00:48 np0005604791 cloud-init[925]: |. o..            |
Feb  2 04:00:48 np0005604791 cloud-init[925]: +----[SHA256]-----+
Feb  2 04:00:48 np0005604791 systemd[1]: Finished Cloud-init: Network Stage.
Feb  2 04:00:48 np0005604791 systemd[1]: Reached target Cloud-config availability.
Feb  2 04:00:48 np0005604791 systemd[1]: Reached target Network is Online.
Feb  2 04:00:48 np0005604791 systemd[1]: Starting Cloud-init: Config Stage...
Feb  2 04:00:48 np0005604791 systemd[1]: Starting Crash recovery kernel arming...
Feb  2 04:00:48 np0005604791 systemd[1]: Starting Notify NFS peers of a restart...
Feb  2 04:00:48 np0005604791 systemd[1]: Starting System Logging Service...
Feb  2 04:00:48 np0005604791 systemd[1]: Starting OpenSSH server daemon...
Feb  2 04:00:48 np0005604791 sm-notify[1008]: Version 2.5.4 starting
Feb  2 04:00:48 np0005604791 systemd[1]: Starting Permit User Sessions...
Feb  2 04:00:48 np0005604791 systemd[1]: Started Notify NFS peers of a restart.
Feb  2 04:00:48 np0005604791 systemd[1]: Finished Permit User Sessions.
Feb  2 04:00:48 np0005604791 systemd[1]: Started Command Scheduler.
Feb  2 04:00:48 np0005604791 systemd[1]: Started Getty on tty1.
Feb  2 04:00:48 np0005604791 systemd[1]: Started Serial Getty on ttyS0.
Feb  2 04:00:48 np0005604791 systemd[1]: Reached target Login Prompts.
Feb  2 04:00:48 np0005604791 systemd[1]: Started OpenSSH server daemon.
Feb  2 04:00:48 np0005604791 rsyslogd[1009]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1009" x-info="https://www.rsyslog.com"] start
Feb  2 04:00:48 np0005604791 rsyslogd[1009]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Feb  2 04:00:48 np0005604791 systemd[1]: Started System Logging Service.
Feb  2 04:00:48 np0005604791 systemd[1]: Reached target Multi-User System.
Feb  2 04:00:48 np0005604791 systemd[1]: Starting Record Runlevel Change in UTMP...
Feb  2 04:00:48 np0005604791 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb  2 04:00:48 np0005604791 systemd[1]: Finished Record Runlevel Change in UTMP.
Feb  2 04:00:48 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:00:48 np0005604791 kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Feb  2 04:00:48 np0005604791 kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-665.el9.x86_64kdump.img
Feb  2 04:00:48 np0005604791 cloud-init[1154]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 02 Feb 2026 09:00:48 +0000. Up 10.88 seconds.
Feb  2 04:00:48 np0005604791 systemd[1]: Finished Cloud-init: Config Stage.
Feb  2 04:00:48 np0005604791 systemd[1]: Starting Cloud-init: Final Stage...
Feb  2 04:00:48 np0005604791 dracut[1269]: dracut-057-102.git20250818.el9
Feb  2 04:00:48 np0005604791 dracut[1271]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-665.el9.x86_64kdump.img 5.14.0-665.el9.x86_64
Feb  2 04:00:48 np0005604791 cloud-init[1305]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 02 Feb 2026 09:00:48 +0000. Up 11.23 seconds.
Feb  2 04:00:48 np0005604791 cloud-init[1341]: #############################################################
Feb  2 04:00:48 np0005604791 cloud-init[1342]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb  2 04:00:48 np0005604791 cloud-init[1344]: 256 SHA256:3cWqaKHv8gpYx1XjtiGfkeI9XXQDI6cmrbyggix79RE root@np0005604791.novalocal (ECDSA)
Feb  2 04:00:48 np0005604791 cloud-init[1346]: 256 SHA256:aYizx4EWEpqD0HGvB5mqoP2aGGblYGGFbTG9xBqOYkw root@np0005604791.novalocal (ED25519)
Feb  2 04:00:49 np0005604791 cloud-init[1348]: 3072 SHA256:SATBxxODRQB/dRXEaifL7F2mZu61vPlj0VS7a3P8BAc root@np0005604791.novalocal (RSA)
Feb  2 04:00:49 np0005604791 cloud-init[1349]: -----END SSH HOST KEY FINGERPRINTS-----
Feb  2 04:00:49 np0005604791 cloud-init[1350]: #############################################################
Feb  2 04:00:49 np0005604791 cloud-init[1305]: Cloud-init v. 24.4-8.el9 finished at Mon, 02 Feb 2026 09:00:49 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.41 seconds
Feb  2 04:00:49 np0005604791 systemd[1]: Finished Cloud-init: Final Stage.
Feb  2 04:00:49 np0005604791 systemd[1]: Reached target Cloud-init target.
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: memstrack is not available
Feb  2 04:00:49 np0005604791 dracut[1271]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb  2 04:00:49 np0005604791 dracut[1271]: memstrack is not available
Feb  2 04:00:49 np0005604791 dracut[1271]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb  2 04:00:50 np0005604791 dracut[1271]: *** Including module: systemd ***
Feb  2 04:00:50 np0005604791 dracut[1271]: *** Including module: fips ***
Feb  2 04:00:50 np0005604791 dracut[1271]: *** Including module: systemd-initrd ***
Feb  2 04:00:50 np0005604791 dracut[1271]: *** Including module: i18n ***
Feb  2 04:00:50 np0005604791 dracut[1271]: *** Including module: drm ***
Feb  2 04:00:50 np0005604791 chronyd[811]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Feb  2 04:00:50 np0005604791 chronyd[811]: System clock TAI offset set to 37 seconds
Feb  2 04:00:50 np0005604791 dracut[1271]: *** Including module: prefixdevname ***
Feb  2 04:00:50 np0005604791 dracut[1271]: *** Including module: kernel-modules ***
Feb  2 04:00:51 np0005604791 kernel: block vda: the capability attribute has been deprecated.
Feb  2 04:00:51 np0005604791 dracut[1271]: *** Including module: kernel-modules-extra ***
Feb  2 04:00:51 np0005604791 dracut[1271]: *** Including module: qemu ***
Feb  2 04:00:51 np0005604791 dracut[1271]: *** Including module: fstab-sys ***
Feb  2 04:00:51 np0005604791 dracut[1271]: *** Including module: rootfs-block ***
Feb  2 04:00:51 np0005604791 dracut[1271]: *** Including module: terminfo ***
Feb  2 04:00:51 np0005604791 dracut[1271]: *** Including module: udev-rules ***
Feb  2 04:00:52 np0005604791 dracut[1271]: Skipping udev rule: 91-permissions.rules
Feb  2 04:00:52 np0005604791 dracut[1271]: Skipping udev rule: 80-drivers-modprobe.rules
Feb  2 04:00:52 np0005604791 dracut[1271]: *** Including module: virtiofs ***
Feb  2 04:00:52 np0005604791 dracut[1271]: *** Including module: dracut-systemd ***
Feb  2 04:00:52 np0005604791 dracut[1271]: *** Including module: usrmount ***
Feb  2 04:00:52 np0005604791 dracut[1271]: *** Including module: base ***
Feb  2 04:00:52 np0005604791 dracut[1271]: *** Including module: fs-lib ***
Feb  2 04:00:52 np0005604791 dracut[1271]: *** Including module: kdumpbase ***
Feb  2 04:00:52 np0005604791 dracut[1271]: *** Including module: microcode_ctl-fw_dir_override ***
Feb  2 04:00:52 np0005604791 dracut[1271]:  microcode_ctl module: mangling fw_dir
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: configuration "intel" is ignored
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Feb  2 04:00:52 np0005604791 dracut[1271]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb  2 04:00:53 np0005604791 dracut[1271]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb  2 04:00:53 np0005604791 dracut[1271]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb  2 04:00:53 np0005604791 dracut[1271]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb  2 04:00:53 np0005604791 dracut[1271]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb  2 04:00:53 np0005604791 dracut[1271]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb  2 04:00:53 np0005604791 dracut[1271]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb  2 04:00:53 np0005604791 dracut[1271]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb  2 04:00:53 np0005604791 dracut[1271]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Feb  2 04:00:53 np0005604791 dracut[1271]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Feb  2 04:00:53 np0005604791 dracut[1271]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Feb  2 04:00:53 np0005604791 dracut[1271]: *** Including module: openssl ***
Feb  2 04:00:53 np0005604791 dracut[1271]: *** Including module: shutdown ***
Feb  2 04:00:53 np0005604791 dracut[1271]: *** Including module: squash ***
Feb  2 04:00:53 np0005604791 dracut[1271]: *** Including modules done ***
Feb  2 04:00:53 np0005604791 dracut[1271]: *** Installing kernel module dependencies ***
Feb  2 04:00:53 np0005604791 irqbalance[804]: Cannot change IRQ 25 affinity: Operation not permitted
Feb  2 04:00:53 np0005604791 irqbalance[804]: IRQ 25 affinity is now unmanaged
Feb  2 04:00:53 np0005604791 irqbalance[804]: Cannot change IRQ 31 affinity: Operation not permitted
Feb  2 04:00:53 np0005604791 irqbalance[804]: IRQ 31 affinity is now unmanaged
Feb  2 04:00:53 np0005604791 irqbalance[804]: Cannot change IRQ 28 affinity: Operation not permitted
Feb  2 04:00:53 np0005604791 irqbalance[804]: IRQ 28 affinity is now unmanaged
Feb  2 04:00:53 np0005604791 irqbalance[804]: Cannot change IRQ 32 affinity: Operation not permitted
Feb  2 04:00:53 np0005604791 irqbalance[804]: IRQ 32 affinity is now unmanaged
Feb  2 04:00:53 np0005604791 irqbalance[804]: Cannot change IRQ 30 affinity: Operation not permitted
Feb  2 04:00:53 np0005604791 irqbalance[804]: IRQ 30 affinity is now unmanaged
Feb  2 04:00:53 np0005604791 irqbalance[804]: Cannot change IRQ 29 affinity: Operation not permitted
Feb  2 04:00:53 np0005604791 irqbalance[804]: IRQ 29 affinity is now unmanaged
Feb  2 04:00:54 np0005604791 dracut[1271]: *** Installing kernel module dependencies done ***
Feb  2 04:00:54 np0005604791 dracut[1271]: *** Resolving executable dependencies ***
Feb  2 04:00:55 np0005604791 dracut[1271]: *** Resolving executable dependencies done ***
Feb  2 04:00:55 np0005604791 dracut[1271]: *** Generating early-microcode cpio image ***
Feb  2 04:00:55 np0005604791 dracut[1271]: *** Store current command line parameters ***
Feb  2 04:00:55 np0005604791 dracut[1271]: Stored kernel commandline:
Feb  2 04:00:55 np0005604791 dracut[1271]: No dracut internal kernel commandline stored in the initramfs
Feb  2 04:00:55 np0005604791 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb  2 04:00:55 np0005604791 dracut[1271]: *** Install squash loader ***
Feb  2 04:00:56 np0005604791 dracut[1271]: *** Squashing the files inside the initramfs ***
Feb  2 04:00:57 np0005604791 dracut[1271]: *** Squashing the files inside the initramfs done ***
Feb  2 04:00:57 np0005604791 dracut[1271]: *** Creating image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' ***
Feb  2 04:00:57 np0005604791 dracut[1271]: *** Hardlinking files ***
Feb  2 04:00:57 np0005604791 dracut[1271]: *** Hardlinking files done ***
Feb  2 04:00:58 np0005604791 dracut[1271]: *** Creating initramfs image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' done ***
Feb  2 04:00:58 np0005604791 kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Feb  2 04:00:58 np0005604791 kdumpctl[1018]: kdump: Starting kdump: [OK]
Feb  2 04:00:58 np0005604791 systemd[1]: Finished Crash recovery kernel arming.
Feb  2 04:00:58 np0005604791 systemd[1]: Startup finished in 1.235s (kernel) + 2.610s (initrd) + 17.100s (userspace) = 20.946s.
Feb  2 04:01:14 np0005604791 systemd[1]: Created slice User Slice of UID 1000.
Feb  2 04:01:14 np0005604791 systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb  2 04:01:14 np0005604791 systemd-logind[805]: New session 1 of user zuul.
Feb  2 04:01:14 np0005604791 systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb  2 04:01:14 np0005604791 systemd[1]: Starting User Manager for UID 1000...
Feb  2 04:01:14 np0005604791 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb  2 04:01:14 np0005604791 systemd[4325]: Queued start job for default target Main User Target.
Feb  2 04:01:14 np0005604791 systemd[4325]: Created slice User Application Slice.
Feb  2 04:01:14 np0005604791 systemd[4325]: Started Mark boot as successful after the user session has run 2 minutes.
Feb  2 04:01:14 np0005604791 systemd[4325]: Started Daily Cleanup of User's Temporary Directories.
Feb  2 04:01:14 np0005604791 systemd[4325]: Reached target Paths.
Feb  2 04:01:14 np0005604791 systemd[4325]: Reached target Timers.
Feb  2 04:01:14 np0005604791 systemd[4325]: Starting D-Bus User Message Bus Socket...
Feb  2 04:01:14 np0005604791 systemd[4325]: Starting Create User's Volatile Files and Directories...
Feb  2 04:01:14 np0005604791 systemd[4325]: Finished Create User's Volatile Files and Directories.
Feb  2 04:01:14 np0005604791 systemd[4325]: Listening on D-Bus User Message Bus Socket.
Feb  2 04:01:14 np0005604791 systemd[4325]: Reached target Sockets.
Feb  2 04:01:14 np0005604791 systemd[4325]: Reached target Basic System.
Feb  2 04:01:14 np0005604791 systemd[4325]: Reached target Main User Target.
Feb  2 04:01:14 np0005604791 systemd[4325]: Startup finished in 179ms.
Feb  2 04:01:14 np0005604791 systemd[1]: Started User Manager for UID 1000.
Feb  2 04:01:14 np0005604791 systemd[1]: Started Session 1 of User zuul.
Feb  2 04:01:15 np0005604791 python3[4409]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:01:18 np0005604791 python3[4437]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:01:26 np0005604791 python3[4495]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:01:27 np0005604791 python3[4535]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb  2 04:01:29 np0005604791 python3[4561]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDres8I0e2lx2XlkDi/o8mbn7A8kJLvscauEMeSccA/Q28EgVAaHKAaMzB7MTuExuZhV2hKdHCjChvbo+ZEItJb42XILxS2oD7nNZFvVgzBQniv52jPQzNymZKv6xSxlAe2fhEntL1UKK7rrlHSbTvpCdGBhDUQsTkZLTXEabEEU2AUKrMcF1w86Dag94m2LcmlUNBhMgEGG2gCAwR3LArhvliT36AiA+uCD9ZLWOYPkktaBOoVTE2SXaHLM/QcLtQ9fjx6HlaVH0Yhtj7rqVbzUqi90TmhLPQuW8eD8LtDzn9vdNraZXTqHagLV5n5OxOivwbk4MGal3/4FVMfbvwmkxfPWWHnq9CpCjdr2/8NZkLs7rZjZtRj+oszTemHh2fSvs0qv1+QN2N9Fo3lRt/o3COnsw0ktNu6Xln+nqj4Bt/yqB5VmDCXaqp2DHhGlCM3XpR2F7xlpNITVJVPl9bGLc9YHytFHIM9fCjt1aMlyP028PhHIHlcB7LcSSd5QM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:29 np0005604791 python3[4585]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:30 np0005604791 python3[4684]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:01:30 np0005604791 python3[4755]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770022889.827167-252-9767779315071/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=b133c3a79151467e8c6849ab0367df01_id_rsa follow=False checksum=97092328ac9cd34b53b1d81cf7562eb94a095d6b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:31 np0005604791 python3[4878]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:01:31 np0005604791 python3[4949]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770022890.7703822-307-164783705280026/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=b133c3a79151467e8c6849ab0367df01_id_rsa.pub follow=False checksum=79261b1251eaaf0ed818421d3062a6de11fbecf0 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:32 np0005604791 python3[4997]: ansible-ping Invoked with data=pong
Feb  2 04:01:33 np0005604791 python3[5021]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:01:36 np0005604791 python3[5079]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb  2 04:01:37 np0005604791 python3[5111]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:37 np0005604791 python3[5135]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:37 np0005604791 python3[5159]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:38 np0005604791 python3[5183]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:38 np0005604791 python3[5207]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:38 np0005604791 python3[5231]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:40 np0005604791 python3[5257]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:40 np0005604791 python3[5335]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:01:41 np0005604791 python3[5408]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1770022900.5012546-32-160872255291395/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:42 np0005604791 python3[5456]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:42 np0005604791 python3[5480]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:42 np0005604791 python3[5504]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:42 np0005604791 python3[5528]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:43 np0005604791 python3[5552]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:43 np0005604791 python3[5576]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:43 np0005604791 python3[5600]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:43 np0005604791 python3[5624]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:44 np0005604791 python3[5648]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:44 np0005604791 python3[5672]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:44 np0005604791 python3[5696]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:44 np0005604791 python3[5720]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:45 np0005604791 python3[5744]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:45 np0005604791 python3[5768]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:45 np0005604791 python3[5792]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:46 np0005604791 python3[5816]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:46 np0005604791 python3[5840]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:46 np0005604791 python3[5864]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:46 np0005604791 python3[5888]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:47 np0005604791 python3[5912]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:47 np0005604791 python3[5936]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:47 np0005604791 python3[5960]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:47 np0005604791 python3[5984]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:48 np0005604791 python3[6008]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:48 np0005604791 python3[6032]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:48 np0005604791 python3[6056]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:01:51 np0005604791 python3[6082]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb  2 04:01:51 np0005604791 systemd[1]: Starting Time & Date Service...
Feb  2 04:01:51 np0005604791 systemd[1]: Started Time & Date Service.
Feb  2 04:01:51 np0005604791 systemd-timedated[6084]: Changed time zone to 'UTC' (UTC).
Feb  2 04:01:51 np0005604791 python3[6113]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:52 np0005604791 python3[6189]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:01:52 np0005604791 python3[6260]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1770022912.0679753-252-275154762755933/source _original_basename=tmp7vuu01de follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:53 np0005604791 python3[6360]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:01:53 np0005604791 python3[6431]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1770022912.8973043-303-57341955622931/source _original_basename=tmpc0ih3ok9 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:54 np0005604791 python3[6533]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:01:54 np0005604791 python3[6606]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1770022914.0356104-382-146128306350918/source _original_basename=tmpzimi10cz follow=False checksum=b9ea63fb38f50d3257ec076159ca59d9b4b7fe2c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:55 np0005604791 python3[6654]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:01:55 np0005604791 python3[6680]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:01:55 np0005604791 python3[6760]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:01:56 np0005604791 python3[6833]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1770022915.592871-452-226389565636795/source _original_basename=tmppg98_f17 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:01:56 np0005604791 python3[6884]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-385c-c5be-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:01:57 np0005604791 python3[6912]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-385c-c5be-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb  2 04:01:58 np0005604791 python3[6940]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:02:03 np0005604791 irqbalance[804]: Cannot change IRQ 27 affinity: Operation not permitted
Feb  2 04:02:03 np0005604791 irqbalance[804]: IRQ 27 affinity is now unmanaged
Feb  2 04:02:17 np0005604791 python3[6966]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:02:21 np0005604791 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb  2 04:03:17 np0005604791 systemd-logind[805]: Session 1 logged out. Waiting for processes to exit.
Feb  2 04:03:29 np0005604791 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb  2 04:03:29 np0005604791 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Feb  2 04:03:29 np0005604791 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Feb  2 04:03:29 np0005604791 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Feb  2 04:03:29 np0005604791 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Feb  2 04:03:29 np0005604791 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Feb  2 04:03:29 np0005604791 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Feb  2 04:03:29 np0005604791 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Feb  2 04:03:29 np0005604791 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Feb  2 04:03:29 np0005604791 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0160] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb  2 04:03:30 np0005604791 systemd-udevd[6970]: Network interface NamePolicy= disabled on kernel command line.
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0299] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0330] settings: (eth1): created default wired connection 'Wired connection 1'
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0336] device (eth1): carrier: link connected
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0339] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0348] policy: auto-activating connection 'Wired connection 1' (4b750100-60f6-352c-8e2d-c3f0f09dae3a)
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0355] device (eth1): Activation: starting connection 'Wired connection 1' (4b750100-60f6-352c-8e2d-c3f0f09dae3a)
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0356] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0362] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0368] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:03:30 np0005604791 NetworkManager[862]: <info>  [1770023010.0375] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb  2 04:03:30 np0005604791 systemd[4325]: Starting Mark boot as successful...
Feb  2 04:03:30 np0005604791 systemd[4325]: Finished Mark boot as successful.
Feb  2 04:03:31 np0005604791 systemd-logind[805]: New session 3 of user zuul.
Feb  2 04:03:31 np0005604791 systemd[1]: Started Session 3 of User zuul.
Feb  2 04:03:31 np0005604791 python3[7001]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-26e4-4fa2-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:03:41 np0005604791 python3[7081]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:03:42 np0005604791 python3[7154]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770023021.5348797-155-64804940174475/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=41a2bd022780f84a2a9b026b65aafb4433cf3332 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:03:42 np0005604791 python3[7204]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:03:42 np0005604791 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb  2 04:03:42 np0005604791 systemd[1]: Stopped Network Manager Wait Online.
Feb  2 04:03:42 np0005604791 systemd[1]: Stopping Network Manager Wait Online...
Feb  2 04:03:42 np0005604791 systemd[1]: Stopping Network Manager...
Feb  2 04:03:42 np0005604791 NetworkManager[862]: <info>  [1770023022.8048] caught SIGTERM, shutting down normally.
Feb  2 04:03:42 np0005604791 NetworkManager[862]: <info>  [1770023022.8062] dhcp4 (eth0): canceled DHCP transaction
Feb  2 04:03:42 np0005604791 NetworkManager[862]: <info>  [1770023022.8063] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb  2 04:03:42 np0005604791 NetworkManager[862]: <info>  [1770023022.8063] dhcp4 (eth0): state changed no lease
Feb  2 04:03:42 np0005604791 NetworkManager[862]: <info>  [1770023022.8066] manager: NetworkManager state is now CONNECTING
Feb  2 04:03:42 np0005604791 NetworkManager[862]: <info>  [1770023022.8192] dhcp4 (eth1): canceled DHCP transaction
Feb  2 04:03:42 np0005604791 NetworkManager[862]: <info>  [1770023022.8192] dhcp4 (eth1): state changed no lease
Feb  2 04:03:42 np0005604791 NetworkManager[862]: <info>  [1770023022.8263] exiting (success)
Feb  2 04:03:42 np0005604791 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb  2 04:03:42 np0005604791 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb  2 04:03:42 np0005604791 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb  2 04:03:42 np0005604791 systemd[1]: Stopped Network Manager.
Feb  2 04:03:42 np0005604791 systemd[1]: NetworkManager.service: Consumed 1.524s CPU time, 9.9M memory peak.
Feb  2 04:03:42 np0005604791 systemd[1]: Starting Network Manager...
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.8881] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:73bfa7d4-cc72-468c-831e-edc1e8589b87)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.8882] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.8951] manager[0x55e12412b000]: monitoring kernel firmware directory '/lib/firmware'.
Feb  2 04:03:42 np0005604791 systemd[1]: Starting Hostname Service...
Feb  2 04:03:42 np0005604791 systemd[1]: Started Hostname Service.
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9628] hostname: hostname: using hostnamed
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9629] hostname: static hostname changed from (none) to "np0005604791.novalocal"
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9637] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9642] manager[0x55e12412b000]: rfkill: Wi-Fi hardware radio set enabled
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9643] manager[0x55e12412b000]: rfkill: WWAN hardware radio set enabled
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9683] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9683] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9684] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9685] manager: Networking is enabled by state file
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9688] settings: Loaded settings plugin: keyfile (internal)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9693] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9730] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9743] dhcp: init: Using DHCP client 'internal'
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9747] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9755] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9762] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9775] device (lo): Activation: starting connection 'lo' (5d713ff7-af86-4df5-9d5a-ad7ed5dcc84d)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9783] device (eth0): carrier: link connected
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9789] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9798] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9799] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9808] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9818] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9825] device (eth1): carrier: link connected
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9831] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9838] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (4b750100-60f6-352c-8e2d-c3f0f09dae3a) (indicated)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9839] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9846] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9856] device (eth1): Activation: starting connection 'Wired connection 1' (4b750100-60f6-352c-8e2d-c3f0f09dae3a)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9864] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb  2 04:03:42 np0005604791 systemd[1]: Started Network Manager.
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9869] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9874] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9876] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9879] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9883] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9886] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9889] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9892] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9902] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9906] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9918] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9922] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9945] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9951] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9959] device (lo): Activation: successful, device activated.
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9970] dhcp4 (eth0): state changed new lease, address=38.102.83.189
Feb  2 04:03:42 np0005604791 NetworkManager[7213]: <info>  [1770023022.9980] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb  2 04:03:42 np0005604791 systemd[1]: Starting Network Manager Wait Online...
Feb  2 04:03:43 np0005604791 NetworkManager[7213]: <info>  [1770023023.0083] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb  2 04:03:43 np0005604791 NetworkManager[7213]: <info>  [1770023023.0105] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb  2 04:03:43 np0005604791 NetworkManager[7213]: <info>  [1770023023.0108] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb  2 04:03:43 np0005604791 NetworkManager[7213]: <info>  [1770023023.0112] manager: NetworkManager state is now CONNECTED_SITE
Feb  2 04:03:43 np0005604791 NetworkManager[7213]: <info>  [1770023023.0117] device (eth0): Activation: successful, device activated.
Feb  2 04:03:43 np0005604791 NetworkManager[7213]: <info>  [1770023023.0124] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb  2 04:03:43 np0005604791 python3[7288]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-26e4-4fa2-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:03:53 np0005604791 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb  2 04:04:12 np0005604791 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6370] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb  2 04:04:28 np0005604791 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb  2 04:04:28 np0005604791 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6635] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6642] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6659] device (eth1): Activation: successful, device activated.
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6673] manager: startup complete
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6681] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <warn>  [1770023068.6698] device (eth1): Activation: failed for connection 'Wired connection 1'
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6710] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Feb  2 04:04:28 np0005604791 systemd[1]: Finished Network Manager Wait Online.
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6812] dhcp4 (eth1): canceled DHCP transaction
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6812] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6813] dhcp4 (eth1): state changed no lease
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6830] policy: auto-activating connection 'ci-private-network' (ee27c9b5-5e51-5927-b192-b2b3e6929a50)
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6837] device (eth1): Activation: starting connection 'ci-private-network' (ee27c9b5-5e51-5927-b192-b2b3e6929a50)
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6838] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6842] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6850] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.6861] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.7217] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.7221] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:04:28 np0005604791 NetworkManager[7213]: <info>  [1770023068.7231] device (eth1): Activation: successful, device activated.
Feb  2 04:04:38 np0005604791 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb  2 04:04:43 np0005604791 systemd[1]: session-3.scope: Deactivated successfully.
Feb  2 04:04:43 np0005604791 systemd[1]: session-3.scope: Consumed 1.513s CPU time.
Feb  2 04:04:43 np0005604791 systemd-logind[805]: Session 3 logged out. Waiting for processes to exit.
Feb  2 04:04:43 np0005604791 systemd-logind[805]: Removed session 3.
Feb  2 04:05:22 np0005604791 systemd-logind[805]: New session 4 of user zuul.
Feb  2 04:05:22 np0005604791 systemd[1]: Started Session 4 of User zuul.
Feb  2 04:05:22 np0005604791 python3[7397]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:05:22 np0005604791 python3[7470]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770023122.3360052-373-268191696952508/source _original_basename=tmp3foveb5q follow=False checksum=f14c371f1ecf34b9a35f6f9273fe37702180eaed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:05:23 np0005604791 irqbalance[804]: Cannot change IRQ 26 affinity: Operation not permitted
Feb  2 04:05:23 np0005604791 irqbalance[804]: IRQ 26 affinity is now unmanaged
Feb  2 04:05:25 np0005604791 systemd[1]: session-4.scope: Deactivated successfully.
Feb  2 04:05:25 np0005604791 systemd-logind[805]: Session 4 logged out. Waiting for processes to exit.
Feb  2 04:05:25 np0005604791 systemd-logind[805]: Removed session 4.
Feb  2 04:06:52 np0005604791 systemd[4325]: Created slice User Background Tasks Slice.
Feb  2 04:06:52 np0005604791 systemd[4325]: Starting Cleanup of User's Temporary Files and Directories...
Feb  2 04:06:52 np0005604791 systemd[4325]: Finished Cleanup of User's Temporary Files and Directories.
Feb  2 04:13:15 np0005604791 systemd-logind[805]: New session 5 of user zuul.
Feb  2 04:13:15 np0005604791 systemd[1]: Started Session 5 of User zuul.
Feb  2 04:13:16 np0005604791 python3[7530]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-f989-50d9-00000000217d-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:13:16 np0005604791 python3[7558]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:13:16 np0005604791 python3[7585]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:13:16 np0005604791 python3[7611]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:13:17 np0005604791 python3[7637]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:13:17 np0005604791 python3[7663]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:13:18 np0005604791 python3[7741]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:13:18 np0005604791 python3[7814]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770023598.109643-547-108383498769253/source _original_basename=tmplgda88eh follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:13:19 np0005604791 python3[7864]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb  2 04:13:19 np0005604791 systemd[1]: Reloading.
Feb  2 04:13:19 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:13:21 np0005604791 python3[7920]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb  2 04:13:21 np0005604791 python3[7946]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:13:22 np0005604791 python3[7974]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:13:22 np0005604791 python3[8002]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:13:22 np0005604791 python3[8030]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:13:23 np0005604791 python3[8057]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-f989-50d9-000000002184-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:13:23 np0005604791 python3[8087]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb  2 04:13:26 np0005604791 systemd[1]: session-5.scope: Deactivated successfully.
Feb  2 04:13:26 np0005604791 systemd[1]: session-5.scope: Consumed 3.969s CPU time.
Feb  2 04:13:26 np0005604791 systemd-logind[805]: Session 5 logged out. Waiting for processes to exit.
Feb  2 04:13:26 np0005604791 systemd-logind[805]: Removed session 5.
Feb  2 04:13:28 np0005604791 systemd-logind[805]: New session 6 of user zuul.
Feb  2 04:13:28 np0005604791 systemd[1]: Started Session 6 of User zuul.
Feb  2 04:13:28 np0005604791 python3[8121]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb  2 04:13:35 np0005604791 setsebool[8164]: The virt_use_nfs policy boolean was changed to 1 by root
Feb  2 04:13:35 np0005604791 setsebool[8164]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb  2 04:13:45 np0005604791 kernel: SELinux:  Converting 386 SID table entries...
Feb  2 04:13:45 np0005604791 kernel: SELinux:  policy capability network_peer_controls=1
Feb  2 04:13:45 np0005604791 kernel: SELinux:  policy capability open_perms=1
Feb  2 04:13:45 np0005604791 kernel: SELinux:  policy capability extended_socket_class=1
Feb  2 04:13:45 np0005604791 kernel: SELinux:  policy capability always_check_network=0
Feb  2 04:13:45 np0005604791 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb  2 04:13:45 np0005604791 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb  2 04:13:45 np0005604791 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb  2 04:13:54 np0005604791 kernel: SELinux:  Converting 389 SID table entries...
Feb  2 04:13:54 np0005604791 kernel: SELinux:  policy capability network_peer_controls=1
Feb  2 04:13:54 np0005604791 kernel: SELinux:  policy capability open_perms=1
Feb  2 04:13:54 np0005604791 kernel: SELinux:  policy capability extended_socket_class=1
Feb  2 04:13:54 np0005604791 kernel: SELinux:  policy capability always_check_network=0
Feb  2 04:13:54 np0005604791 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb  2 04:13:54 np0005604791 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb  2 04:13:54 np0005604791 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb  2 04:14:11 np0005604791 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb  2 04:14:11 np0005604791 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb  2 04:14:12 np0005604791 systemd[1]: Starting man-db-cache-update.service...
Feb  2 04:14:12 np0005604791 systemd[1]: Reloading.
Feb  2 04:14:12 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:14:12 np0005604791 systemd[1]: Queuing reload/restart jobs for marked units…
Feb  2 04:14:16 np0005604791 python3[12810]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-cc31-c4d5-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:14:17 np0005604791 kernel: evm: overlay not supported
Feb  2 04:14:17 np0005604791 systemd[4325]: Starting D-Bus User Message Bus...
Feb  2 04:14:17 np0005604791 systemd[4325]: Started D-Bus User Message Bus.
Feb  2 04:14:17 np0005604791 dbus-broker-launch[13941]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb  2 04:14:17 np0005604791 dbus-broker-launch[13941]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb  2 04:14:17 np0005604791 dbus-broker-lau[13941]: Ready
Feb  2 04:14:17 np0005604791 systemd[4325]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb  2 04:14:17 np0005604791 systemd[4325]: Created slice Slice /user.
Feb  2 04:14:17 np0005604791 systemd[4325]: podman-13840.scope: unit configures an IP firewall, but not running as root.
Feb  2 04:14:17 np0005604791 systemd[4325]: (This warning is only shown for the first unit using IP firewalling.)
Feb  2 04:14:17 np0005604791 systemd[4325]: Started podman-13840.scope.
Feb  2 04:14:17 np0005604791 systemd[4325]: Started podman-pause-189668ee.scope.
Feb  2 04:14:18 np0005604791 systemd[1]: session-6.scope: Deactivated successfully.
Feb  2 04:14:18 np0005604791 systemd[1]: session-6.scope: Consumed 40.101s CPU time.
Feb  2 04:14:18 np0005604791 systemd-logind[805]: Session 6 logged out. Waiting for processes to exit.
Feb  2 04:14:18 np0005604791 systemd-logind[805]: Removed session 6.
Feb  2 04:14:38 np0005604791 systemd-logind[805]: New session 7 of user zuul.
Feb  2 04:14:38 np0005604791 systemd[1]: Started Session 7 of User zuul.
Feb  2 04:14:38 np0005604791 python3[23541]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBAK6tX32HcxwxspxXPo2b5qp7NanSpxzQsxoSXNQ1fyRzMKWHr/dNDElPeQbQ0mmJ7TyKZaqVEp5TJcSLpUuKw= zuul@np0005604789.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:14:39 np0005604791 python3[23770]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBAK6tX32HcxwxspxXPo2b5qp7NanSpxzQsxoSXNQ1fyRzMKWHr/dNDElPeQbQ0mmJ7TyKZaqVEp5TJcSLpUuKw= zuul@np0005604789.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:14:39 np0005604791 python3[24122]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604791.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb  2 04:14:40 np0005604791 python3[24336]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBAK6tX32HcxwxspxXPo2b5qp7NanSpxzQsxoSXNQ1fyRzMKWHr/dNDElPeQbQ0mmJ7TyKZaqVEp5TJcSLpUuKw= zuul@np0005604789.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb  2 04:14:40 np0005604791 python3[24632]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:14:41 np0005604791 python3[24874]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1770023680.4837782-151-178742569184994/source _original_basename=tmpklhajr8w follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:14:42 np0005604791 python3[25256]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Feb  2 04:14:42 np0005604791 systemd[1]: Starting Hostname Service...
Feb  2 04:14:42 np0005604791 systemd[1]: Started Hostname Service.
Feb  2 04:14:42 np0005604791 systemd-hostnamed[25372]: Changed pretty hostname to 'compute-1'
Feb  2 04:14:42 np0005604791 systemd-hostnamed[25372]: Hostname set to <compute-1> (static)
Feb  2 04:14:42 np0005604791 NetworkManager[7213]: <info>  [1770023682.1595] hostname: static hostname changed from "np0005604791.novalocal" to "compute-1"
Feb  2 04:14:42 np0005604791 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb  2 04:14:42 np0005604791 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb  2 04:14:42 np0005604791 systemd[1]: session-7.scope: Deactivated successfully.
Feb  2 04:14:42 np0005604791 systemd[1]: session-7.scope: Consumed 2.271s CPU time.
Feb  2 04:14:42 np0005604791 systemd-logind[805]: Session 7 logged out. Waiting for processes to exit.
Feb  2 04:14:42 np0005604791 systemd-logind[805]: Removed session 7.
Feb  2 04:14:52 np0005604791 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb  2 04:14:52 np0005604791 systemd[1]: Finished man-db-cache-update.service.
Feb  2 04:14:52 np0005604791 systemd[1]: man-db-cache-update.service: Consumed 46.634s CPU time.
Feb  2 04:14:52 np0005604791 systemd[1]: run-r03f04290d3444c01b388663d1379ff93.service: Deactivated successfully.
Feb  2 04:14:52 np0005604791 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb  2 04:15:12 np0005604791 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb  2 04:15:52 np0005604791 systemd[1]: Starting Cleanup of Temporary Directories...
Feb  2 04:15:52 np0005604791 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb  2 04:15:52 np0005604791 systemd[1]: Finished Cleanup of Temporary Directories.
Feb  2 04:15:52 np0005604791 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb  2 04:18:07 np0005604791 systemd-logind[805]: New session 8 of user zuul.
Feb  2 04:18:07 np0005604791 systemd[1]: Started Session 8 of User zuul.
Feb  2 04:18:07 np0005604791 python3[30065]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:18:09 np0005604791 python3[30181]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:18:09 np0005604791 python3[30254]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=delorean.repo follow=False checksum=cc4ab4695da8ec58c451521a3dd2f41014af145d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:18:10 np0005604791 python3[30280]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:18:10 np0005604791 python3[30353]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:18:10 np0005604791 python3[30379]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:18:11 np0005604791 python3[30452]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:18:11 np0005604791 python3[30478]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:18:11 np0005604791 python3[30551]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:18:11 np0005604791 python3[30577]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:18:12 np0005604791 python3[30650]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:18:12 np0005604791 python3[30676]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:18:12 np0005604791 python3[30749]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:18:13 np0005604791 python3[30775]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:18:13 np0005604791 python3[30848]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=362a603578148d54e8cd25942b88d7f471cc677a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:18:25 np0005604791 python3[30896]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:23:24 np0005604791 systemd[1]: session-8.scope: Deactivated successfully.
Feb  2 04:23:24 np0005604791 systemd[1]: session-8.scope: Consumed 5.195s CPU time.
Feb  2 04:23:24 np0005604791 systemd-logind[805]: Session 8 logged out. Waiting for processes to exit.
Feb  2 04:23:24 np0005604791 systemd-logind[805]: Removed session 8.
Feb  2 04:28:52 np0005604791 systemd[1]: Starting dnf makecache...
Feb  2 04:28:52 np0005604791 dnf[30904]: Failed determining last makecache time.
Feb  2 04:28:52 np0005604791 dnf[30904]: delorean-openstack-barbican-42b4c41831408a8e323 344 kB/s |  13 kB     00:00
Feb  2 04:28:52 np0005604791 dnf[30904]: delorean-python-glean-642fffe0203a8ffcc2443db52 2.4 MB/s |  65 kB     00:00
Feb  2 04:28:52 np0005604791 dnf[30904]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.4 MB/s |  32 kB     00:00
Feb  2 04:28:52 np0005604791 dnf[30904]: delorean-python-stevedore-c4acc5639fd2329372142 5.0 MB/s | 131 kB     00:00
Feb  2 04:28:52 np0005604791 dnf[30904]: delorean-python-cloudkitty-tests-tempest-783703 1.4 MB/s |  32 kB     00:00
Feb  2 04:28:52 np0005604791 dnf[30904]: delorean-diskimage-builder-61b717cc45660834fe9a  11 MB/s | 349 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-openstack-nova-eaa65f0b85123a4ee343246 1.7 MB/s |  42 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-python-designate-tests-tempest-347fdbc 645 kB/s |  18 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-openstack-glance-1fd12c29b339f30fe823e 655 kB/s |  18 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.0 MB/s |  29 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-openstack-manila-d783d10e75495b73866db 897 kB/s |  25 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-openstack-neutron-95cadbd379667c8520c8 5.3 MB/s | 154 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-openstack-octavia-5975097dd4b021385178 946 kB/s |  26 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-openstack-watcher-c014f81a8647287f6dcc 593 kB/s |  16 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-python-tcib-78032d201b02cee27e8e644c61 307 kB/s | 7.4 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.0 MB/s | 144 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-openstack-swift-dc98a8463506ac520c469a 489 kB/s |  14 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-python-tempestconf-8515371b7cceebd4282 2.1 MB/s |  53 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: delorean-openstack-heat-ui-013accbfd179753bc3f0 3.3 MB/s |  96 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: CentOS Stream 9 - BaseOS                         51 kB/s | 6.7 kB     00:00
Feb  2 04:28:53 np0005604791 dnf[30904]: CentOS Stream 9 - AppStream                      54 kB/s | 6.8 kB     00:00
Feb  2 04:28:54 np0005604791 dnf[30904]: CentOS Stream 9 - CRB                            28 kB/s | 6.6 kB     00:00
Feb  2 04:28:54 np0005604791 dnf[30904]: CentOS Stream 9 - Extras packages                55 kB/s | 7.3 kB     00:00
Feb  2 04:28:54 np0005604791 dnf[30904]: dlrn-antelope-testing                            29 MB/s | 1.1 MB     00:00
Feb  2 04:28:54 np0005604791 dnf[30904]: dlrn-antelope-build-deps                         14 MB/s | 461 kB     00:00
Feb  2 04:28:55 np0005604791 dnf[30904]: centos9-rabbitmq                                9.2 MB/s | 123 kB     00:00
Feb  2 04:28:55 np0005604791 dnf[30904]: centos9-storage                                  20 MB/s | 415 kB     00:00
Feb  2 04:28:55 np0005604791 dnf[30904]: centos9-opstools                                4.0 MB/s |  51 kB     00:00
Feb  2 04:28:55 np0005604791 dnf[30904]: NFV SIG OpenvSwitch                              26 MB/s | 461 kB     00:00
Feb  2 04:28:55 np0005604791 dnf[30904]: repo-setup-centos-appstream                     121 MB/s |  26 MB     00:00
Feb  2 04:29:01 np0005604791 dnf[30904]: repo-setup-centos-baseos                        101 MB/s | 8.9 MB     00:00
Feb  2 04:29:02 np0005604791 dnf[30904]: repo-setup-centos-highavailability               32 MB/s | 744 kB     00:00
Feb  2 04:29:02 np0005604791 dnf[30904]: repo-setup-centos-powertools                     90 MB/s | 7.6 MB     00:00
Feb  2 04:29:05 np0005604791 dnf[30904]: Extra Packages for Enterprise Linux 9 - x86_64   17 MB/s |  20 MB     00:01
Feb  2 04:29:17 np0005604791 dnf[30904]: Metadata cache created.
Feb  2 04:29:17 np0005604791 systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb  2 04:29:17 np0005604791 systemd[1]: Finished dnf makecache.
Feb  2 04:29:17 np0005604791 systemd[1]: dnf-makecache.service: Consumed 23.152s CPU time.
Feb  2 04:29:51 np0005604791 systemd-logind[805]: New session 9 of user zuul.
Feb  2 04:29:51 np0005604791 systemd[1]: Started Session 9 of User zuul.
Feb  2 04:29:52 np0005604791 python3.9[31159]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:29:53 np0005604791 python3.9[31340]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:30:00 np0005604791 systemd[1]: session-9.scope: Deactivated successfully.
Feb  2 04:30:00 np0005604791 systemd[1]: session-9.scope: Consumed 7.080s CPU time.
Feb  2 04:30:00 np0005604791 systemd-logind[805]: Session 9 logged out. Waiting for processes to exit.
Feb  2 04:30:00 np0005604791 systemd-logind[805]: Removed session 9.
Feb  2 04:30:16 np0005604791 systemd-logind[805]: New session 10 of user zuul.
Feb  2 04:30:16 np0005604791 systemd[1]: Started Session 10 of User zuul.
Feb  2 04:30:17 np0005604791 python3.9[31551]: ansible-ansible.legacy.ping Invoked with data=pong
Feb  2 04:30:18 np0005604791 python3.9[31725]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:30:19 np0005604791 python3.9[31877]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:30:20 np0005604791 python3.9[32030]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:30:20 np0005604791 python3.9[32182]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:30:21 np0005604791 python3.9[32334]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:30:22 np0005604791 python3.9[32457]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024621.0426342-173-256343527697352/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:30:22 np0005604791 python3.9[32609]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:30:23 np0005604791 python3.9[32765]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:30:24 np0005604791 python3.9[32917]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:30:25 np0005604791 python3.9[33067]: ansible-ansible.builtin.service_facts Invoked
Feb  2 04:30:29 np0005604791 python3.9[33320]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:30:30 np0005604791 python3.9[33470]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:30:31 np0005604791 python3.9[33624]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:30:32 np0005604791 python3.9[33782]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:30:33 np0005604791 python3.9[33866]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:31:13 np0005604791 systemd[1]: Reloading.
Feb  2 04:31:13 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:31:13 np0005604791 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb  2 04:31:14 np0005604791 systemd[1]: Reloading.
Feb  2 04:31:14 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:31:14 np0005604791 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb  2 04:31:14 np0005604791 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb  2 04:31:14 np0005604791 systemd[1]: Reloading.
Feb  2 04:31:14 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:31:14 np0005604791 systemd[1]: Listening on LVM2 poll daemon socket.
Feb  2 04:31:14 np0005604791 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb  2 04:31:14 np0005604791 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb  2 04:31:14 np0005604791 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb  2 04:32:08 np0005604791 kernel: SELinux:  Converting 2728 SID table entries...
Feb  2 04:32:08 np0005604791 kernel: SELinux:  policy capability network_peer_controls=1
Feb  2 04:32:08 np0005604791 kernel: SELinux:  policy capability open_perms=1
Feb  2 04:32:08 np0005604791 kernel: SELinux:  policy capability extended_socket_class=1
Feb  2 04:32:08 np0005604791 kernel: SELinux:  policy capability always_check_network=0
Feb  2 04:32:08 np0005604791 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb  2 04:32:08 np0005604791 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb  2 04:32:08 np0005604791 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb  2 04:32:08 np0005604791 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb  2 04:32:08 np0005604791 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb  2 04:32:08 np0005604791 systemd[1]: Starting man-db-cache-update.service...
Feb  2 04:32:08 np0005604791 systemd[1]: Reloading.
Feb  2 04:32:08 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:32:09 np0005604791 systemd[1]: Queuing reload/restart jobs for marked units…
Feb  2 04:32:09 np0005604791 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb  2 04:32:09 np0005604791 systemd[1]: Finished man-db-cache-update.service.
Feb  2 04:32:09 np0005604791 systemd[1]: run-r8e3c2b150b6f4bd895136e821349cf4e.service: Deactivated successfully.
Feb  2 04:32:11 np0005604791 python3.9[35394]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:32:14 np0005604791 python3.9[35676]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb  2 04:32:15 np0005604791 python3.9[35828]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb  2 04:32:17 np0005604791 python3.9[35982]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:32:18 np0005604791 python3.9[36136]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb  2 04:32:19 np0005604791 python3.9[36288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:32:20 np0005604791 python3.9[36440]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:32:25 np0005604791 python3.9[36563]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024740.1902425-662-60105461617986/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:32:26 np0005604791 python3.9[36715]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:32:27 np0005604791 python3.9[36867]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:32:28 np0005604791 python3.9[37020]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:32:29 np0005604791 python3.9[37172]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb  2 04:32:29 np0005604791 python3.9[37325]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb  2 04:32:29 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:32:30 np0005604791 python3.9[37484]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb  2 04:32:31 np0005604791 python3.9[37644]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb  2 04:32:32 np0005604791 python3.9[37797]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb  2 04:32:32 np0005604791 python3.9[37955]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb  2 04:32:33 np0005604791 python3.9[38107]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:32:36 np0005604791 python3.9[38260]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:32:36 np0005604791 python3.9[38412]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:32:37 np0005604791 python3.9[38535]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770024756.255102-1020-264444790041307/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:32:38 np0005604791 python3.9[38687]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:32:38 np0005604791 systemd[1]: Starting Load Kernel Modules...
Feb  2 04:32:38 np0005604791 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb  2 04:32:38 np0005604791 kernel: Bridge firewalling registered
Feb  2 04:32:38 np0005604791 systemd-modules-load[38691]: Inserted module 'br_netfilter'
Feb  2 04:32:38 np0005604791 systemd[1]: Finished Load Kernel Modules.
Feb  2 04:32:39 np0005604791 python3.9[38846]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:32:39 np0005604791 python3.9[38969]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770024758.661151-1088-37317981670626/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:32:40 np0005604791 python3.9[39121]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:32:43 np0005604791 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb  2 04:32:43 np0005604791 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb  2 04:32:44 np0005604791 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb  2 04:32:44 np0005604791 systemd[1]: Starting man-db-cache-update.service...
Feb  2 04:32:44 np0005604791 systemd[1]: Reloading.
Feb  2 04:32:44 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:32:44 np0005604791 systemd[1]: Queuing reload/restart jobs for marked units…
Feb  2 04:32:45 np0005604791 python3.9[41039]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:32:46 np0005604791 python3.9[42243]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb  2 04:32:47 np0005604791 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb  2 04:32:47 np0005604791 python3.9[43134]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:32:47 np0005604791 systemd[1]: Finished man-db-cache-update.service.
Feb  2 04:32:47 np0005604791 systemd[1]: man-db-cache-update.service: Consumed 3.586s CPU time.
Feb  2 04:32:47 np0005604791 systemd[1]: run-r4d41e351b2a34d2086783bc988b436fd.service: Deactivated successfully.
Feb  2 04:32:47 np0005604791 python3.9[43325]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:32:48 np0005604791 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb  2 04:32:48 np0005604791 systemd[1]: Starting Authorization Manager...
Feb  2 04:32:48 np0005604791 systemd[1]: Started Dynamic System Tuning Daemon.
Feb  2 04:32:48 np0005604791 polkitd[43542]: Started polkitd version 0.117
Feb  2 04:32:48 np0005604791 systemd[1]: Started Authorization Manager.
Feb  2 04:32:49 np0005604791 python3.9[43712]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:32:50 np0005604791 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb  2 04:32:50 np0005604791 systemd[1]: tuned.service: Deactivated successfully.
Feb  2 04:32:50 np0005604791 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb  2 04:32:50 np0005604791 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb  2 04:32:50 np0005604791 systemd[1]: Started Dynamic System Tuning Daemon.
Feb  2 04:32:51 np0005604791 python3.9[43873]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb  2 04:32:55 np0005604791 python3.9[44025]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:32:55 np0005604791 systemd[1]: Reloading.
Feb  2 04:32:55 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:32:56 np0005604791 python3.9[44214]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:32:56 np0005604791 systemd[1]: Reloading.
Feb  2 04:32:56 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:32:57 np0005604791 python3.9[44403]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:32:57 np0005604791 python3.9[44556]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:32:57 np0005604791 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Feb  2 04:32:58 np0005604791 python3.9[44709]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:33:00 np0005604791 python3.9[44871]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:33:01 np0005604791 python3.9[45024]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:33:01 np0005604791 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb  2 04:33:01 np0005604791 systemd[1]: Stopped Apply Kernel Variables.
Feb  2 04:33:01 np0005604791 systemd[1]: Stopping Apply Kernel Variables...
Feb  2 04:33:01 np0005604791 systemd[1]: Starting Apply Kernel Variables...
Feb  2 04:33:01 np0005604791 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb  2 04:33:01 np0005604791 systemd[1]: Finished Apply Kernel Variables.
Feb  2 04:33:02 np0005604791 systemd[1]: session-10.scope: Deactivated successfully.
Feb  2 04:33:02 np0005604791 systemd[1]: session-10.scope: Consumed 1min 59.714s CPU time.
Feb  2 04:33:02 np0005604791 systemd-logind[805]: Session 10 logged out. Waiting for processes to exit.
Feb  2 04:33:02 np0005604791 systemd-logind[805]: Removed session 10.
Feb  2 04:33:07 np0005604791 systemd-logind[805]: New session 11 of user zuul.
Feb  2 04:33:07 np0005604791 systemd[1]: Started Session 11 of User zuul.
Feb  2 04:33:08 np0005604791 python3.9[45207]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:33:10 np0005604791 python3.9[45363]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb  2 04:33:11 np0005604791 python3.9[45516]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb  2 04:33:11 np0005604791 python3.9[45674]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb  2 04:33:13 np0005604791 python3.9[45834]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:33:14 np0005604791 python3.9[45918]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb  2 04:33:16 np0005604791 python3.9[46081]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:33:26 np0005604791 kernel: SELinux:  Converting 2740 SID table entries...
Feb  2 04:33:26 np0005604791 kernel: SELinux:  policy capability network_peer_controls=1
Feb  2 04:33:26 np0005604791 kernel: SELinux:  policy capability open_perms=1
Feb  2 04:33:26 np0005604791 kernel: SELinux:  policy capability extended_socket_class=1
Feb  2 04:33:26 np0005604791 kernel: SELinux:  policy capability always_check_network=0
Feb  2 04:33:26 np0005604791 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb  2 04:33:26 np0005604791 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb  2 04:33:26 np0005604791 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb  2 04:33:27 np0005604791 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb  2 04:33:27 np0005604791 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb  2 04:33:28 np0005604791 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb  2 04:33:28 np0005604791 systemd[1]: Starting man-db-cache-update.service...
Feb  2 04:33:28 np0005604791 systemd[1]: Reloading.
Feb  2 04:33:28 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:33:28 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:33:28 np0005604791 systemd[1]: Queuing reload/restart jobs for marked units…
Feb  2 04:33:28 np0005604791 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb  2 04:33:28 np0005604791 systemd[1]: Finished man-db-cache-update.service.
Feb  2 04:33:28 np0005604791 systemd[1]: run-r5dc8cebddb074e7ba314dac0c9caa4ca.service: Deactivated successfully.
Feb  2 04:33:30 np0005604791 python3.9[47181]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb  2 04:33:30 np0005604791 systemd[1]: Reloading.
Feb  2 04:33:30 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:33:30 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:33:30 np0005604791 systemd[1]: Starting Open vSwitch Database Unit...
Feb  2 04:33:30 np0005604791 chown[47223]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb  2 04:33:30 np0005604791 ovs-ctl[47228]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb  2 04:33:30 np0005604791 ovs-ctl[47228]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb  2 04:33:30 np0005604791 ovs-ctl[47228]: Starting ovsdb-server [  OK  ]
Feb  2 04:33:30 np0005604791 ovs-vsctl[47278]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb  2 04:33:31 np0005604791 ovs-vsctl[47298]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2f54a3b0-231a-4b96-9e3a-0a36e3e73216\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Feb  2 04:33:31 np0005604791 ovs-ctl[47228]: Configuring Open vSwitch system IDs [  OK  ]
Feb  2 04:33:31 np0005604791 ovs-vsctl[47304]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Feb  2 04:33:31 np0005604791 ovs-ctl[47228]: Enabling remote OVSDB managers [  OK  ]
Feb  2 04:33:31 np0005604791 systemd[1]: Started Open vSwitch Database Unit.
Feb  2 04:33:31 np0005604791 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb  2 04:33:31 np0005604791 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb  2 04:33:31 np0005604791 systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb  2 04:33:31 np0005604791 kernel: openvswitch: Open vSwitch switching datapath
Feb  2 04:33:31 np0005604791 ovs-ctl[47349]: Inserting openvswitch module [  OK  ]
Feb  2 04:33:31 np0005604791 ovs-ctl[47318]: Starting ovs-vswitchd [  OK  ]
Feb  2 04:33:31 np0005604791 ovs-vsctl[47369]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Feb  2 04:33:31 np0005604791 ovs-ctl[47318]: Enabling remote OVSDB managers [  OK  ]
Feb  2 04:33:31 np0005604791 systemd[1]: Started Open vSwitch Forwarding Unit.
Feb  2 04:33:31 np0005604791 systemd[1]: Starting Open vSwitch...
Feb  2 04:33:31 np0005604791 systemd[1]: Finished Open vSwitch.
Feb  2 04:33:32 np0005604791 python3.9[47521]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:33:33 np0005604791 python3.9[47673]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb  2 04:33:34 np0005604791 kernel: SELinux:  Converting 2754 SID table entries...
Feb  2 04:33:34 np0005604791 kernel: SELinux:  policy capability network_peer_controls=1
Feb  2 04:33:34 np0005604791 kernel: SELinux:  policy capability open_perms=1
Feb  2 04:33:34 np0005604791 kernel: SELinux:  policy capability extended_socket_class=1
Feb  2 04:33:34 np0005604791 kernel: SELinux:  policy capability always_check_network=0
Feb  2 04:33:34 np0005604791 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb  2 04:33:34 np0005604791 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb  2 04:33:34 np0005604791 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb  2 04:33:35 np0005604791 python3.9[47828]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:33:36 np0005604791 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb  2 04:33:36 np0005604791 python3.9[47986]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:33:38 np0005604791 python3.9[48139]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:33:39 np0005604791 python3.9[48426]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb  2 04:33:40 np0005604791 python3.9[48576]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:33:41 np0005604791 python3.9[48730]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:33:43 np0005604791 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb  2 04:33:43 np0005604791 systemd[1]: Starting man-db-cache-update.service...
Feb  2 04:33:43 np0005604791 systemd[1]: Reloading.
Feb  2 04:33:43 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:33:43 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:33:43 np0005604791 systemd[1]: Queuing reload/restart jobs for marked units…
Feb  2 04:33:43 np0005604791 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb  2 04:33:43 np0005604791 systemd[1]: Finished man-db-cache-update.service.
Feb  2 04:33:43 np0005604791 systemd[1]: run-raf11a8c34f8349dba4fb96444987709c.service: Deactivated successfully.
Feb  2 04:33:44 np0005604791 python3.9[49047]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:33:44 np0005604791 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb  2 04:33:44 np0005604791 systemd[1]: Stopped Network Manager Wait Online.
Feb  2 04:33:44 np0005604791 systemd[1]: Stopping Network Manager Wait Online...
Feb  2 04:33:44 np0005604791 systemd[1]: Stopping Network Manager...
Feb  2 04:33:44 np0005604791 NetworkManager[7213]: <info>  [1770024824.6651] caught SIGTERM, shutting down normally.
Feb  2 04:33:44 np0005604791 NetworkManager[7213]: <info>  [1770024824.6667] dhcp4 (eth0): canceled DHCP transaction
Feb  2 04:33:44 np0005604791 NetworkManager[7213]: <info>  [1770024824.6667] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb  2 04:33:44 np0005604791 NetworkManager[7213]: <info>  [1770024824.6667] dhcp4 (eth0): state changed no lease
Feb  2 04:33:44 np0005604791 NetworkManager[7213]: <info>  [1770024824.6671] manager: NetworkManager state is now CONNECTED_SITE
Feb  2 04:33:44 np0005604791 NetworkManager[7213]: <info>  [1770024824.6739] exiting (success)
Feb  2 04:33:44 np0005604791 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb  2 04:33:44 np0005604791 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb  2 04:33:44 np0005604791 systemd[1]: Stopped Network Manager.
Feb  2 04:33:44 np0005604791 systemd[1]: NetworkManager.service: Consumed 14.564s CPU time, 4.1M memory peak, read 0B from disk, written 34.5K to disk.
Feb  2 04:33:44 np0005604791 systemd[1]: Starting Network Manager...
Feb  2 04:33:44 np0005604791 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.7235] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:73bfa7d4-cc72-468c-831e-edc1e8589b87)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.7236] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.7286] manager[0x55ab2a03f000]: monitoring kernel firmware directory '/lib/firmware'.
Feb  2 04:33:44 np0005604791 systemd[1]: Starting Hostname Service...
Feb  2 04:33:44 np0005604791 systemd[1]: Started Hostname Service.
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8143] hostname: hostname: using hostnamed
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8144] hostname: static hostname changed from (none) to "compute-1"
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8151] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8157] manager[0x55ab2a03f000]: rfkill: Wi-Fi hardware radio set enabled
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8157] manager[0x55ab2a03f000]: rfkill: WWAN hardware radio set enabled
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8187] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8201] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8202] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8203] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8203] manager: Networking is enabled by state file
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8206] settings: Loaded settings plugin: keyfile (internal)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8211] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8248] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8261] dhcp: init: Using DHCP client 'internal'
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8265] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8272] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8280] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8291] device (lo): Activation: starting connection 'lo' (5d713ff7-af86-4df5-9d5a-ad7ed5dcc84d)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8299] device (eth0): carrier: link connected
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8305] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8312] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8313] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8321] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8330] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8337] device (eth1): carrier: link connected
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8343] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8349] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (ee27c9b5-5e51-5927-b192-b2b3e6929a50) (indicated)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8350] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8357] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8367] device (eth1): Activation: starting connection 'ci-private-network' (ee27c9b5-5e51-5927-b192-b2b3e6929a50)
Feb  2 04:33:44 np0005604791 systemd[1]: Started Network Manager.
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8373] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8390] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8394] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8398] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8408] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8413] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8417] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8420] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8425] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8435] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8439] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8450] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8472] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8487] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8492] dhcp4 (eth0): state changed new lease, address=38.102.83.189
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8498] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8507] device (lo): Activation: successful, device activated.
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8522] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb  2 04:33:44 np0005604791 systemd[1]: Starting Network Manager Wait Online...
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8596] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8602] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8609] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8613] manager: NetworkManager state is now CONNECTED_LOCAL
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8617] device (eth1): Activation: successful, device activated.
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8628] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8629] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8634] manager: NetworkManager state is now CONNECTED_SITE
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8636] device (eth0): Activation: successful, device activated.
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8644] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb  2 04:33:44 np0005604791 NetworkManager[49055]: <info>  [1770024824.8648] manager: startup complete
Feb  2 04:33:44 np0005604791 systemd[1]: Finished Network Manager Wait Online.
Feb  2 04:33:45 np0005604791 python3.9[49273]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:33:49 np0005604791 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb  2 04:33:49 np0005604791 systemd[1]: Starting man-db-cache-update.service...
Feb  2 04:33:49 np0005604791 systemd[1]: Reloading.
Feb  2 04:33:49 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:33:49 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:33:49 np0005604791 systemd[1]: Queuing reload/restart jobs for marked units…
Feb  2 04:33:50 np0005604791 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb  2 04:33:50 np0005604791 systemd[1]: Finished man-db-cache-update.service.
Feb  2 04:33:50 np0005604791 systemd[1]: run-rddc579ad61724063b6b170ff9fb0ec23.service: Deactivated successfully.
Feb  2 04:33:51 np0005604791 python3.9[49732]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:33:52 np0005604791 python3.9[49884]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:33:53 np0005604791 python3.9[50038]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:33:53 np0005604791 python3.9[50190]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:33:54 np0005604791 python3.9[50342]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:33:54 np0005604791 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb  2 04:33:55 np0005604791 python3.9[50494]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:33:55 np0005604791 python3.9[50646]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:33:56 np0005604791 python3.9[50769]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024835.1926363-643-195722698107248/.source _original_basename=.n50w5qmt follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:33:57 np0005604791 python3.9[50921]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:33:58 np0005604791 python3.9[51073]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb  2 04:33:58 np0005604791 python3.9[51225]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:34:01 np0005604791 python3.9[51652]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb  2 04:34:02 np0005604791 ansible-async_wrapper.py[51827]: Invoked with j446185006143 300 /home/zuul/.ansible/tmp/ansible-tmp-1770024841.3594537-841-228079003665104/AnsiballZ_edpm_os_net_config.py _
Feb  2 04:34:02 np0005604791 ansible-async_wrapper.py[51830]: Starting module and watcher
Feb  2 04:34:02 np0005604791 ansible-async_wrapper.py[51830]: Start watching 51831 (300)
Feb  2 04:34:02 np0005604791 ansible-async_wrapper.py[51831]: Start module (51831)
Feb  2 04:34:02 np0005604791 ansible-async_wrapper.py[51827]: Return async_wrapper task started.
Feb  2 04:34:02 np0005604791 python3.9[51832]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Feb  2 04:34:03 np0005604791 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb  2 04:34:03 np0005604791 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb  2 04:34:03 np0005604791 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Feb  2 04:34:03 np0005604791 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb  2 04:34:03 np0005604791 kernel: cfg80211: failed to load regulatory.db
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.3270] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.3297] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4017] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4019] audit: op="connection-add" uuid="12147afe-840c-4d71-9d90-87848bae2322" name="br-ex-br" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4036] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4037] audit: op="connection-add" uuid="64011fed-cfe7-4a87-81c3-73a14c673061" name="br-ex-port" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4051] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4053] audit: op="connection-add" uuid="64571508-7a4f-4b5e-89c5-3e1f65b65a54" name="eth1-port" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4066] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4068] audit: op="connection-add" uuid="1bb4e91c-0132-453d-a16f-0b09c1c65047" name="vlan20-port" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4080] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4082] audit: op="connection-add" uuid="ebf6e626-46d2-4e7b-ac53-74cb51313eba" name="vlan21-port" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4097] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4098] audit: op="connection-add" uuid="0b57298e-2470-4dd2-a204-40cf2c152f2d" name="vlan22-port" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4111] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4113] audit: op="connection-add" uuid="a493b690-bd4a-4885-a502-03c3741c796c" name="vlan23-port" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4137] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4158] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4160] audit: op="connection-add" uuid="5073fcc2-d4a1-4649-b330-3ba11bcafee3" name="br-ex-if" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4206] audit: op="connection-update" uuid="ee27c9b5-5e51-5927-b192-b2b3e6929a50" name="ci-private-network" args="ipv6.routing-rules,ipv6.dns,ipv6.addr-gen-mode,ipv6.routes,ipv6.method,ipv6.addresses,ovs-external-ids.data,ovs-interface.type,connection.controller,connection.port-type,connection.master,connection.slave-type,connection.timestamp,ipv4.never-default,ipv4.dns,ipv4.routing-rules,ipv4.routes,ipv4.method,ipv4.addresses" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4233] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4234] audit: op="connection-add" uuid="7057c86f-124b-448c-8b89-082381965557" name="vlan20-if" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4253] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4255] audit: op="connection-add" uuid="8bf5f96c-762d-42bf-9c6e-7b928bb9f5de" name="vlan21-if" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4282] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4284] audit: op="connection-add" uuid="c8540e5b-66cb-4359-99d1-b8486c8c24f2" name="vlan22-if" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4313] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4316] audit: op="connection-add" uuid="e90cf2e2-a0e0-4fc9-a1d7-b162c2f7e589" name="vlan23-if" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4334] audit: op="connection-delete" uuid="4b750100-60f6-352c-8e2d-c3f0f09dae3a" name="Wired connection 1" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4354] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4358] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4369] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4376] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (12147afe-840c-4d71-9d90-87848bae2322)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4377] audit: op="connection-activate" uuid="12147afe-840c-4d71-9d90-87848bae2322" name="br-ex-br" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4380] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4381] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4388] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4394] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (64011fed-cfe7-4a87-81c3-73a14c673061)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4396] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4398] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4404] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4409] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (64571508-7a4f-4b5e-89c5-3e1f65b65a54)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4411] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4413] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4420] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4430] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (1bb4e91c-0132-453d-a16f-0b09c1c65047)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4435] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4436] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4447] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4457] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ebf6e626-46d2-4e7b-ac53-74cb51313eba)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4460] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4462] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4470] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4475] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (0b57298e-2470-4dd2-a204-40cf2c152f2d)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4478] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4479] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4487] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4493] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (a493b690-bd4a-4885-a502-03c3741c796c)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4494] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4498] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4501] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4509] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4510] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4524] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4529] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (5073fcc2-d4a1-4649-b330-3ba11bcafee3)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4529] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4533] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4535] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4536] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4537] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4550] device (eth1): disconnecting for new activation request.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4551] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4554] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4555] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4557] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4560] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4561] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4564] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4568] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (7057c86f-124b-448c-8b89-082381965557)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4569] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4572] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4574] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4575] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4578] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4579] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4582] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4586] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (8bf5f96c-762d-42bf-9c6e-7b928bb9f5de)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4587] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4590] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4592] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4594] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4596] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4597] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4600] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4605] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (c8540e5b-66cb-4359-99d1-b8486c8c24f2)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4605] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4608] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4610] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4611] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4614] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <warn>  [1770024844.4615] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4618] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4623] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (e90cf2e2-a0e0-4fc9-a1d7-b162c2f7e589)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4623] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4626] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4628] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4629] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4631] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4647] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4649] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4654] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4656] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4662] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4666] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4671] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4674] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4676] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 kernel: ovs-system: entered promiscuous mode
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4680] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4684] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4686] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4688] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4692] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4695] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4698] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4701] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 systemd-udevd[51839]: Network interface NamePolicy= disabled on kernel command line.
Feb  2 04:34:04 np0005604791 kernel: Timeout policy base is empty
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4707] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4710] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4713] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4714] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4718] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4721] dhcp4 (eth0): canceled DHCP transaction
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4722] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4722] dhcp4 (eth0): state changed no lease
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4723] dhcp4 (eth0): activation: beginning transaction (no timeout)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4733] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4736] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51833 uid=0 result="fail" reason="Device is not activated"
Feb  2 04:34:04 np0005604791 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4770] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4775] dhcp4 (eth0): state changed new lease, address=38.102.83.189
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4814] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4833] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4843] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4863] device (eth1): disconnecting for new activation request.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4864] audit: op="connection-activate" uuid="ee27c9b5-5e51-5927-b192-b2b3e6929a50" name="ci-private-network" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4899] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51833 uid=0 result="success"
Feb  2 04:34:04 np0005604791 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.4940] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb  2 04:34:04 np0005604791 kernel: br-ex: entered promiscuous mode
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5090] device (eth1): Activation: starting connection 'ci-private-network' (ee27c9b5-5e51-5927-b192-b2b3e6929a50)
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5097] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5106] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5111] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5118] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5123] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5133] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5136] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5138] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5140] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5142] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5144] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 kernel: vlan22: entered promiscuous mode
Feb  2 04:34:04 np0005604791 systemd-udevd[51838]: Network interface NamePolicy= disabled on kernel command line.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5158] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5174] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5179] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 kernel: vlan23: entered promiscuous mode
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5184] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5188] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5193] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5197] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5201] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5206] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5210] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5215] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5219] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5224] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5236] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5242] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5257] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Feb  2 04:34:04 np0005604791 kernel: vlan21: entered promiscuous mode
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5274] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5279] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5296] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5302] device (eth1): Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5310] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Feb  2 04:34:04 np0005604791 kernel: vlan20: entered promiscuous mode
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5325] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5354] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5369] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5375] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5385] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5393] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5399] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5406] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5408] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5413] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5420] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5425] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5432] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5437] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5451] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5468] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5474] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5478] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5482] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5488] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5489] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb  2 04:34:04 np0005604791 NetworkManager[49055]: <info>  [1770024844.5493] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Feb  2 04:34:05 np0005604791 NetworkManager[49055]: <info>  [1770024845.6691] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51833 uid=0 result="success"
Feb  2 04:34:05 np0005604791 NetworkManager[49055]: <info>  [1770024845.8340] checkpoint[0x55ab2a014950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Feb  2 04:34:05 np0005604791 NetworkManager[49055]: <info>  [1770024845.8343] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51833 uid=0 result="success"
Feb  2 04:34:05 np0005604791 python3.9[52190]: ansible-ansible.legacy.async_status Invoked with jid=j446185006143.51827 mode=status _async_dir=/root/.ansible_async
Feb  2 04:34:06 np0005604791 NetworkManager[49055]: <info>  [1770024846.1883] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51833 uid=0 result="success"
Feb  2 04:34:06 np0005604791 NetworkManager[49055]: <info>  [1770024846.1898] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51833 uid=0 result="success"
Feb  2 04:34:06 np0005604791 NetworkManager[49055]: <info>  [1770024846.4390] audit: op="networking-control" arg="global-dns-configuration" pid=51833 uid=0 result="success"
Feb  2 04:34:06 np0005604791 NetworkManager[49055]: <info>  [1770024846.4422] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Feb  2 04:34:06 np0005604791 NetworkManager[49055]: <info>  [1770024846.4452] audit: op="networking-control" arg="global-dns-configuration" pid=51833 uid=0 result="success"
Feb  2 04:34:06 np0005604791 NetworkManager[49055]: <info>  [1770024846.4487] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51833 uid=0 result="success"
Feb  2 04:34:06 np0005604791 NetworkManager[49055]: <info>  [1770024846.6004] checkpoint[0x55ab2a014a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Feb  2 04:34:06 np0005604791 NetworkManager[49055]: <info>  [1770024846.6008] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51833 uid=0 result="success"
Feb  2 04:34:06 np0005604791 ansible-async_wrapper.py[51831]: Module complete (51831)
Feb  2 04:34:07 np0005604791 ansible-async_wrapper.py[51830]: Done in kid B.
Feb  2 04:34:09 np0005604791 python3.9[52296]: ansible-ansible.legacy.async_status Invoked with jid=j446185006143.51827 mode=status _async_dir=/root/.ansible_async
Feb  2 04:34:10 np0005604791 python3.9[52396]: ansible-ansible.legacy.async_status Invoked with jid=j446185006143.51827 mode=cleanup _async_dir=/root/.ansible_async
Feb  2 04:34:11 np0005604791 python3.9[52548]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:34:12 np0005604791 python3.9[52671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024851.0908043-922-92139863094412/.source.returncode _original_basename=.3too2mv0 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:34:12 np0005604791 python3.9[52824]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:34:13 np0005604791 python3.9[52947]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024852.4113955-970-24807962649288/.source.cfg _original_basename=.iopzdpeq follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:34:14 np0005604791 python3.9[53099]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:34:14 np0005604791 systemd[1]: Reloading Network Manager...
Feb  2 04:34:14 np0005604791 NetworkManager[49055]: <info>  [1770024854.6484] audit: op="reload" arg="0" pid=53103 uid=0 result="success"
Feb  2 04:34:14 np0005604791 NetworkManager[49055]: <info>  [1770024854.6490] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Feb  2 04:34:14 np0005604791 systemd[1]: Reloaded Network Manager.
Feb  2 04:34:14 np0005604791 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb  2 04:34:15 np0005604791 systemd[1]: session-11.scope: Deactivated successfully.
Feb  2 04:34:15 np0005604791 systemd[1]: session-11.scope: Consumed 43.624s CPU time.
Feb  2 04:34:15 np0005604791 systemd-logind[805]: Session 11 logged out. Waiting for processes to exit.
Feb  2 04:34:15 np0005604791 systemd-logind[805]: Removed session 11.
Feb  2 04:34:20 np0005604791 systemd-logind[805]: New session 12 of user zuul.
Feb  2 04:34:20 np0005604791 systemd[1]: Started Session 12 of User zuul.
Feb  2 04:34:21 np0005604791 python3.9[53289]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:34:22 np0005604791 python3.9[53443]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:34:23 np0005604791 python3.9[53637]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:34:24 np0005604791 systemd[1]: session-12.scope: Deactivated successfully.
Feb  2 04:34:24 np0005604791 systemd[1]: session-12.scope: Consumed 2.101s CPU time.
Feb  2 04:34:24 np0005604791 systemd-logind[805]: Session 12 logged out. Waiting for processes to exit.
Feb  2 04:34:24 np0005604791 systemd-logind[805]: Removed session 12.
Feb  2 04:34:24 np0005604791 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb  2 04:34:29 np0005604791 systemd-logind[805]: New session 13 of user zuul.
Feb  2 04:34:29 np0005604791 systemd[1]: Started Session 13 of User zuul.
Feb  2 04:34:30 np0005604791 python3.9[53819]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:34:31 np0005604791 python3.9[53973]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:34:32 np0005604791 python3.9[54130]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:34:33 np0005604791 python3.9[54214]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:34:35 np0005604791 python3.9[54367]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:34:36 np0005604791 python3.9[54563]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:34:37 np0005604791 python3.9[54715]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:34:37 np0005604791 systemd[1]: var-lib-containers-storage-overlay-compat3221578336-merged.mount: Deactivated successfully.
Feb  2 04:34:37 np0005604791 podman[54716]: 2026-02-02 09:34:37.483190949 +0000 UTC m=+0.064142764 system refresh
Feb  2 04:34:38 np0005604791 python3.9[54879]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:34:38 np0005604791 systemd[1]: var-lib-containers-storage-overlay-opaque\x2dbug\x2dcheck2425735953-merged.mount: Deactivated successfully.
Feb  2 04:34:38 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:34:39 np0005604791 python3.9[55002]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024877.6941218-193-73341524666599/.source.json follow=False _original_basename=podman_network_config.j2 checksum=85b621edfd6b57aa7ba64b670d6d71e13d1e3a57 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:34:39 np0005604791 python3.9[55154]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:34:40 np0005604791 python3.9[55277]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770024879.2886827-238-154031085959593/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:34:41 np0005604791 python3.9[55429]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:34:42 np0005604791 python3.9[55581]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:34:42 np0005604791 python3.9[55733]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:34:43 np0005604791 python3.9[55885]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:34:43 np0005604791 python3.9[56037]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:34:46 np0005604791 python3.9[56190]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:34:46 np0005604791 python3.9[56344]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:34:47 np0005604791 python3.9[56496]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:34:48 np0005604791 python3.9[56648]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:34:49 np0005604791 python3.9[56801]: ansible-service_facts Invoked
Feb  2 04:34:49 np0005604791 network[56818]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb  2 04:34:49 np0005604791 network[56819]: 'network-scripts' will be removed from distribution in near future.
Feb  2 04:34:49 np0005604791 network[56820]: It is advised to switch to 'NetworkManager' instead for network management.
Feb  2 04:34:54 np0005604791 python3.9[57272]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:34:57 np0005604791 python3.9[57425]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb  2 04:34:58 np0005604791 python3.9[57577]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:34:59 np0005604791 python3.9[57702]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024898.4417396-671-281242515459565/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:00 np0005604791 python3.9[57856]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:00 np0005604791 python3.9[57981]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024899.8367538-716-45054612975642/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:02 np0005604791 python3.9[58135]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:04 np0005604791 python3.9[58289]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:35:05 np0005604791 python3.9[58373]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:35:06 np0005604791 python3.9[58527]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:35:07 np0005604791 python3.9[58611]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:35:07 np0005604791 chronyd[811]: chronyd exiting
Feb  2 04:35:07 np0005604791 systemd[1]: Stopping NTP client/server...
Feb  2 04:35:07 np0005604791 systemd[1]: chronyd.service: Deactivated successfully.
Feb  2 04:35:07 np0005604791 systemd[1]: Stopped NTP client/server.
Feb  2 04:35:07 np0005604791 systemd[1]: Starting NTP client/server...
Feb  2 04:35:07 np0005604791 chronyd[58619]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb  2 04:35:07 np0005604791 chronyd[58619]: Frequency -26.550 +/- 0.484 ppm read from /var/lib/chrony/drift
Feb  2 04:35:07 np0005604791 chronyd[58619]: Loaded seccomp filter (level 2)
Feb  2 04:35:07 np0005604791 systemd[1]: Started NTP client/server.
Feb  2 04:35:08 np0005604791 systemd[1]: session-13.scope: Deactivated successfully.
Feb  2 04:35:08 np0005604791 systemd[1]: session-13.scope: Consumed 22.583s CPU time.
Feb  2 04:35:08 np0005604791 systemd-logind[805]: Session 13 logged out. Waiting for processes to exit.
Feb  2 04:35:08 np0005604791 systemd-logind[805]: Removed session 13.
Feb  2 04:35:13 np0005604791 systemd-logind[805]: New session 14 of user zuul.
Feb  2 04:35:13 np0005604791 systemd[1]: Started Session 14 of User zuul.
Feb  2 04:35:14 np0005604791 python3.9[58800]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:14 np0005604791 python3.9[58952]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:15 np0005604791 python3.9[59075]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024914.3400397-58-245572025514354/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:15 np0005604791 systemd[1]: session-14.scope: Deactivated successfully.
Feb  2 04:35:15 np0005604791 systemd[1]: session-14.scope: Consumed 1.480s CPU time.
Feb  2 04:35:15 np0005604791 systemd-logind[805]: Session 14 logged out. Waiting for processes to exit.
Feb  2 04:35:15 np0005604791 systemd-logind[805]: Removed session 14.
Feb  2 04:35:21 np0005604791 systemd-logind[805]: New session 15 of user zuul.
Feb  2 04:35:21 np0005604791 systemd[1]: Started Session 15 of User zuul.
Feb  2 04:35:22 np0005604791 python3.9[59253]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:35:23 np0005604791 python3.9[59409]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:24 np0005604791 python3.9[59584]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:24 np0005604791 python3.9[59707]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1770024923.6217854-79-242405258553402/.source.json _original_basename=.q8wsgptc follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:25 np0005604791 python3.9[59859]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:26 np0005604791 python3.9[59982]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024925.2686567-148-78079986202256/.source _original_basename=.ek_rd0vo follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:26 np0005604791 python3.9[60134]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:35:27 np0005604791 python3.9[60286]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:28 np0005604791 python3.9[60409]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770024927.2232182-220-246681203115187/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:35:28 np0005604791 python3.9[60561]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:29 np0005604791 python3.9[60684]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770024928.3833046-220-31431851639660/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:35:30 np0005604791 python3.9[60836]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:30 np0005604791 python3.9[60988]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:31 np0005604791 python3.9[61111]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024930.4272084-331-43439646552512/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:32 np0005604791 python3.9[61263]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:32 np0005604791 python3.9[61386]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024931.6567607-376-8828196314413/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:33 np0005604791 python3.9[61538]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:35:33 np0005604791 systemd[1]: Reloading.
Feb  2 04:35:33 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:35:33 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:35:33 np0005604791 systemd[1]: Reloading.
Feb  2 04:35:34 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:35:34 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:35:34 np0005604791 systemd[1]: Starting EDPM Container Shutdown...
Feb  2 04:35:34 np0005604791 systemd[1]: Finished EDPM Container Shutdown.
Feb  2 04:35:34 np0005604791 python3.9[61766]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:35 np0005604791 python3.9[61889]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024934.3337915-445-190825462867451/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:36 np0005604791 python3.9[62041]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:36 np0005604791 python3.9[62164]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024935.567769-490-40701539531467/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:37 np0005604791 python3.9[62316]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:35:37 np0005604791 systemd[1]: Reloading.
Feb  2 04:35:37 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:35:37 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:35:37 np0005604791 systemd[1]: Reloading.
Feb  2 04:35:37 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:35:37 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:35:37 np0005604791 systemd[1]: Starting Create netns directory...
Feb  2 04:35:37 np0005604791 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb  2 04:35:37 np0005604791 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb  2 04:35:37 np0005604791 systemd[1]: Finished Create netns directory.
Feb  2 04:35:38 np0005604791 python3.9[62543]: ansible-ansible.builtin.service_facts Invoked
Feb  2 04:35:38 np0005604791 network[62560]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb  2 04:35:38 np0005604791 network[62561]: 'network-scripts' will be removed from distribution in near future.
Feb  2 04:35:38 np0005604791 network[62562]: It is advised to switch to 'NetworkManager' instead for network management.
Feb  2 04:35:43 np0005604791 python3.9[62824]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:35:43 np0005604791 systemd[1]: Reloading.
Feb  2 04:35:43 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:35:43 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:35:43 np0005604791 systemd[1]: Stopping IPv4 firewall with iptables...
Feb  2 04:35:43 np0005604791 iptables.init[62864]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Feb  2 04:35:43 np0005604791 iptables.init[62864]: iptables: Flushing firewall rules: [  OK  ]
Feb  2 04:35:43 np0005604791 systemd[1]: iptables.service: Deactivated successfully.
Feb  2 04:35:43 np0005604791 systemd[1]: Stopped IPv4 firewall with iptables.
Feb  2 04:35:44 np0005604791 python3.9[63060]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:35:45 np0005604791 python3.9[63214]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:35:45 np0005604791 systemd[1]: Reloading.
Feb  2 04:35:45 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:35:45 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:35:45 np0005604791 systemd[1]: Starting Netfilter Tables...
Feb  2 04:35:45 np0005604791 systemd[1]: Finished Netfilter Tables.
Feb  2 04:35:46 np0005604791 python3.9[63406]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:35:47 np0005604791 python3.9[63559]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:48 np0005604791 python3.9[63684]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024947.1182163-697-176938743890081/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:49 np0005604791 python3.9[63837]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:35:49 np0005604791 systemd[1]: Reloading OpenSSH server daemon...
Feb  2 04:35:49 np0005604791 systemd[1]: Reloaded OpenSSH server daemon.
Feb  2 04:35:49 np0005604791 python3.9[63993]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:50 np0005604791 python3.9[64145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:50 np0005604791 python3.9[64268]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024949.9953647-790-95178274619575/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:52 np0005604791 python3.9[64420]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb  2 04:35:52 np0005604791 systemd[1]: Starting Time & Date Service...
Feb  2 04:35:52 np0005604791 systemd[1]: Started Time & Date Service.
Feb  2 04:35:53 np0005604791 python3.9[64576]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:53 np0005604791 python3.9[64728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:54 np0005604791 python3.9[64851]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024953.2336028-895-61303885170294/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:55 np0005604791 python3.9[65003]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:55 np0005604791 python3.9[65126]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024954.56732-940-7544299619871/.source.yaml _original_basename=.4bw___po follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:56 np0005604791 python3.9[65278]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:35:56 np0005604791 python3.9[65401]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024955.811372-986-118083414410790/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:35:57 np0005604791 python3.9[65553]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:35:58 np0005604791 python3.9[65706]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:35:58 np0005604791 python3[65859]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb  2 04:35:59 np0005604791 python3.9[66011]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:36:00 np0005604791 python3.9[66134]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024959.0886939-1102-239301028363261/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:00 np0005604791 python3.9[66286]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:36:01 np0005604791 python3.9[66409]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024960.2624795-1147-280765877889088/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:01 np0005604791 python3.9[66561]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:36:02 np0005604791 python3.9[66684]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024961.3785193-1193-249992278542607/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:03 np0005604791 python3.9[66836]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:36:03 np0005604791 python3.9[66959]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024962.539503-1237-144775975056401/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:04 np0005604791 python3.9[67111]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:36:04 np0005604791 python3.9[67234]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024963.721225-1282-246468476647806/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:05 np0005604791 python3.9[67386]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:06 np0005604791 python3.9[67538]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:36:07 np0005604791 python3.9[67697]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:07 np0005604791 python3.9[67850]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:08 np0005604791 python3.9[68002]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:09 np0005604791 python3.9[68154]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb  2 04:36:09 np0005604791 python3.9[68307]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb  2 04:36:10 np0005604791 systemd-logind[805]: Session 15 logged out. Waiting for processes to exit.
Feb  2 04:36:10 np0005604791 systemd[1]: session-15.scope: Deactivated successfully.
Feb  2 04:36:10 np0005604791 systemd[1]: session-15.scope: Consumed 31.215s CPU time.
Feb  2 04:36:10 np0005604791 systemd-logind[805]: Removed session 15.
Feb  2 04:36:15 np0005604791 systemd-logind[805]: New session 16 of user zuul.
Feb  2 04:36:15 np0005604791 systemd[1]: Started Session 16 of User zuul.
Feb  2 04:36:16 np0005604791 python3.9[68488]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb  2 04:36:17 np0005604791 python3.9[68640]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:36:18 np0005604791 python3.9[68792]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:36:19 np0005604791 python3.9[68944]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDTA16t8OsOL4s99BOiNF3vckRPwnc9DwrgEMUjNAF5ofBbR7O7JlFD47GnI33lZr51vVc0wnvTxhpFA0jVvhKqVWdJ3lApNf34bJmaJBr8uiy/i3Q84MsUtXBLQ0FDCbwgaPnreNbMz3ae+u9H+Z73jQSP+gnQ5oYWhONHgO4HHkF8K7a8Bow3H5qwfbHz8o7mFQmTpYHwOcwhA53BTbh1NiEJZJNSg7wi1hH7vELUAzts1cbF2slTE0nh8XjMogq9ukokrCIKfE+xX7PmAawCuMnfvGX93zF1298pGcUKqvpnIfUOMDGtJtYEZ8sWsr5aH1YXIoJfHuux/YosRx3XDD5oEcpX0nYKVW6bumHsFIS199XAM5LtWWNr2eMcrbZhVwHNdELC6zoL7QjbBQ+2j/+8nJLq9vIghewgO3EFWK3r7kIVQZg8GYLZ/yisH4cvzUTACRXAF+1o2rq+AUfX3nTSsrqyZQUwlnWpc1vsceEO0Lsuac5tvGylnsJBfmM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN317jbKb2FNELHPgcKtyDLq5kCgCZN/b/8qYDuirt4l#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNpgfrlTfGut7rGFnGEpIiXrs2U1SQK0Fr1bAmmw8notvdnn6jtGfPfwX96hGwcOu4AlAS/i7X7XgbLw573Ooww=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDXvxaVTYbHTHv+9EzKdF3T8+Yr2otW2YLuSqNTF+yJaKACfB7wDlIhKDGTHiU1FDrkO4tJ+R3OL/2ZXoIlxp5JSdCgcb42X+5PTj1wPkayVlQW7e0wQvT3kYhrcPtjLgk4T39/sionMGYUat45idwoB6hUSPLdk/L5+n0/3LEg1lByOM/B1/p8wGzHn6H9CWoIP3Ctd6lmrxtIVU1u+pxiBVQCcMjw5gtqsB54l670fL7El5XEkqjRjKHhylw9QTYN3AWMKuQKwcjClm/57/SoFMP7o52r653wGDH9cpvDgs0RYG4bA1mGY5OMkYbDJfcy0CViKEu5qWW4cTBLh/Z88D2EuNlINj3Q1YJk3RwF6vYl31MMsbBW10YhIiBJrA5XF0BLARqBOZ1e6v7JKTSwa7wGGtRzEzbY+me9zl6ZhhDru/I+h24J4MeBA07HvQIS2v8O95tPz76YZJ3DkWlywFWbALG8M4+fkpuQtvVpBZMgdvIWW0kfXO/grGnrgY8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIG3OEs+fDFWrKRKifY4uXYtOpS/6/8E88qPQNs1apj/z#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFy9hRh0QDNcy30491f4FwmL+9BopSuPxbkVyWhY9VytT/FG5rm9/DLYyukpd9IKttcZyerq0gzfokDrht76FB4=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpaaLVd9Gqbxcksz46sKNkp3Eu2TY3fUjtOhbkLQru93qJt/RNDTocNiUrE9VAj/UXp9dZqSHg1Hr7ScqXu7zqgZ9i+mq6N7P7QR+ZkN8jLQSybnPztI7X/QWaPhT0j1ArMrYk2F2Me+kAQiFL0GoR2d8udRElL8YKKIYQ6zjC/h2ZsU0WyVET9uiTgeMP/njtMzRSgO2Wp6no4KqJEOMSEY1lgURjVsMWkTr4hGz523SooA41GzquuNamnj1ELwKZSAH+TtVgI8oFJ2T+5TZiE/oW2MizbBwjKA3V5DlnGOEG49eG+LhZ/eWb6jQ7OnJARA/iLU/FsJ+CaGSbRK20/OWXP4JSZu7liaD0DIHM0DwrjEnQcXI6SbfAoAQ494KFtZvFamem7CPtrVhgNAKqybRbDcEQGpDxQgrWeA3m4HyGIBym+IvMUfYlNke9frCkwNpXRH93TK6E/ziPFrBHKkdRcFxVdsG2u1Y+adxOQk7KCjq/skzXBPCPDaHnzBM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIKtQmhiX/LRkxZONUn47u07V1HNePVW1EWKmTbmuGuY#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE0cPV3BwiB9Cc5Ne48bCCSZwMzF/hH7iFXwAiP/TK2pzWYsdZw1mOSJ+vDu1KclkDtQKmwN6Cu0N7j7domqlzE=#012 create=True mode=0644 path=/tmp/ansible.50jbc6k_ state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:19 np0005604791 python3.9[69096]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.50jbc6k_' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:36:20 np0005604791 python3.9[69250]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.50jbc6k_ state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:21 np0005604791 systemd[1]: session-16.scope: Deactivated successfully.
Feb  2 04:36:21 np0005604791 systemd[1]: session-16.scope: Consumed 2.798s CPU time.
Feb  2 04:36:21 np0005604791 systemd-logind[805]: Session 16 logged out. Waiting for processes to exit.
Feb  2 04:36:21 np0005604791 systemd-logind[805]: Removed session 16.
Feb  2 04:36:22 np0005604791 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb  2 04:36:26 np0005604791 systemd-logind[805]: New session 17 of user zuul.
Feb  2 04:36:26 np0005604791 systemd[1]: Started Session 17 of User zuul.
Feb  2 04:36:27 np0005604791 python3.9[69430]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:36:29 np0005604791 python3.9[69586]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb  2 04:36:29 np0005604791 python3.9[69740]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:36:30 np0005604791 python3.9[69893]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:36:31 np0005604791 python3.9[70046]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:36:32 np0005604791 python3.9[70200]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:36:33 np0005604791 python3.9[70355]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:36:33 np0005604791 systemd[1]: session-17.scope: Deactivated successfully.
Feb  2 04:36:33 np0005604791 systemd[1]: session-17.scope: Consumed 3.779s CPU time.
Feb  2 04:36:33 np0005604791 systemd-logind[805]: Session 17 logged out. Waiting for processes to exit.
Feb  2 04:36:33 np0005604791 systemd-logind[805]: Removed session 17.
Feb  2 04:36:38 np0005604791 systemd-logind[805]: New session 18 of user zuul.
Feb  2 04:36:38 np0005604791 systemd[1]: Started Session 18 of User zuul.
Feb  2 04:36:40 np0005604791 python3.9[70533]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:36:41 np0005604791 python3.9[70689]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:36:41 np0005604791 python3.9[70773]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb  2 04:36:43 np0005604791 python3.9[70924]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:36:45 np0005604791 python3.9[71075]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb  2 04:36:45 np0005604791 python3.9[71225]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:36:45 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:36:46 np0005604791 python3.9[71376]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:36:47 np0005604791 systemd[1]: session-18.scope: Deactivated successfully.
Feb  2 04:36:47 np0005604791 systemd[1]: session-18.scope: Consumed 5.128s CPU time.
Feb  2 04:36:47 np0005604791 systemd-logind[805]: Session 18 logged out. Waiting for processes to exit.
Feb  2 04:36:47 np0005604791 systemd-logind[805]: Removed session 18.
Feb  2 04:36:55 np0005604791 systemd-logind[805]: New session 19 of user zuul.
Feb  2 04:36:55 np0005604791 systemd[1]: Started Session 19 of User zuul.
Feb  2 04:37:01 np0005604791 python3[72144]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:37:02 np0005604791 python3[72239]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb  2 04:37:04 np0005604791 python3[72266]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb  2 04:37:04 np0005604791 python3[72292]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:37:04 np0005604791 kernel: loop: module loaded
Feb  2 04:37:04 np0005604791 kernel: loop3: detected capacity change from 0 to 41943040
Feb  2 04:37:04 np0005604791 python3[72327]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:37:05 np0005604791 lvm[72330]: PV /dev/loop3 not used.
Feb  2 04:37:05 np0005604791 lvm[72332]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb  2 04:37:05 np0005604791 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Feb  2 04:37:05 np0005604791 lvm[72335]:  1 logical volume(s) in volume group "ceph_vg0" now active
Feb  2 04:37:05 np0005604791 lvm[72342]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb  2 04:37:05 np0005604791 lvm[72342]: VG ceph_vg0 finished
Feb  2 04:37:05 np0005604791 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Feb  2 04:37:05 np0005604791 python3[72420]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb  2 04:37:06 np0005604791 python3[72493]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770025025.5627344-36892-11820083717439/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:37:06 np0005604791 python3[72543]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:37:06 np0005604791 systemd[1]: Reloading.
Feb  2 04:37:06 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:37:06 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:37:07 np0005604791 systemd[1]: Starting Ceph OSD losetup...
Feb  2 04:37:07 np0005604791 bash[72582]: /dev/loop3: [64513]:4329557 (/var/lib/ceph-osd-0.img)
Feb  2 04:37:07 np0005604791 systemd[1]: Finished Ceph OSD losetup.
Feb  2 04:37:07 np0005604791 lvm[72583]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb  2 04:37:07 np0005604791 lvm[72583]: VG ceph_vg0 finished
Feb  2 04:37:09 np0005604791 python3[72607]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:37:16 np0005604791 chronyd[58619]: Selected source 142.4.192.253 (pool.ntp.org)
Feb  2 04:38:28 np0005604791 systemd-logind[805]: New session 20 of user ceph-admin.
Feb  2 04:38:28 np0005604791 systemd[1]: Created slice User Slice of UID 42477.
Feb  2 04:38:28 np0005604791 systemd[1]: Starting User Runtime Directory /run/user/42477...
Feb  2 04:38:28 np0005604791 systemd[1]: Finished User Runtime Directory /run/user/42477.
Feb  2 04:38:28 np0005604791 systemd[1]: Starting User Manager for UID 42477...
Feb  2 04:38:28 np0005604791 systemd[72655]: Queued start job for default target Main User Target.
Feb  2 04:38:28 np0005604791 systemd[72655]: Created slice User Application Slice.
Feb  2 04:38:28 np0005604791 systemd[72655]: Started Mark boot as successful after the user session has run 2 minutes.
Feb  2 04:38:28 np0005604791 systemd[72655]: Started Daily Cleanup of User's Temporary Directories.
Feb  2 04:38:28 np0005604791 systemd[72655]: Reached target Paths.
Feb  2 04:38:28 np0005604791 systemd[72655]: Reached target Timers.
Feb  2 04:38:28 np0005604791 systemd[72655]: Starting D-Bus User Message Bus Socket...
Feb  2 04:38:28 np0005604791 systemd[72655]: Starting Create User's Volatile Files and Directories...
Feb  2 04:38:28 np0005604791 systemd-logind[805]: New session 22 of user ceph-admin.
Feb  2 04:38:28 np0005604791 systemd[72655]: Finished Create User's Volatile Files and Directories.
Feb  2 04:38:28 np0005604791 systemd[72655]: Listening on D-Bus User Message Bus Socket.
Feb  2 04:38:28 np0005604791 systemd[72655]: Reached target Sockets.
Feb  2 04:38:28 np0005604791 systemd[72655]: Reached target Basic System.
Feb  2 04:38:28 np0005604791 systemd[72655]: Reached target Main User Target.
Feb  2 04:38:28 np0005604791 systemd[72655]: Startup finished in 101ms.
Feb  2 04:38:28 np0005604791 systemd[1]: Started User Manager for UID 42477.
Feb  2 04:38:28 np0005604791 systemd[1]: Started Session 20 of User ceph-admin.
Feb  2 04:38:28 np0005604791 systemd[1]: Started Session 22 of User ceph-admin.
Feb  2 04:38:28 np0005604791 systemd-logind[805]: New session 23 of user ceph-admin.
Feb  2 04:38:28 np0005604791 systemd[1]: Started Session 23 of User ceph-admin.
Feb  2 04:38:29 np0005604791 systemd-logind[805]: New session 24 of user ceph-admin.
Feb  2 04:38:29 np0005604791 systemd[1]: Started Session 24 of User ceph-admin.
Feb  2 04:38:29 np0005604791 systemd-logind[805]: New session 25 of user ceph-admin.
Feb  2 04:38:29 np0005604791 systemd[1]: Started Session 25 of User ceph-admin.
Feb  2 04:38:29 np0005604791 systemd-logind[805]: New session 26 of user ceph-admin.
Feb  2 04:38:29 np0005604791 systemd[1]: Started Session 26 of User ceph-admin.
Feb  2 04:38:30 np0005604791 systemd-logind[805]: New session 27 of user ceph-admin.
Feb  2 04:38:30 np0005604791 systemd[1]: Started Session 27 of User ceph-admin.
Feb  2 04:38:30 np0005604791 systemd-logind[805]: New session 28 of user ceph-admin.
Feb  2 04:38:30 np0005604791 systemd[1]: Started Session 28 of User ceph-admin.
Feb  2 04:38:30 np0005604791 systemd-logind[805]: New session 29 of user ceph-admin.
Feb  2 04:38:30 np0005604791 systemd[1]: Started Session 29 of User ceph-admin.
Feb  2 04:38:31 np0005604791 systemd-logind[805]: New session 30 of user ceph-admin.
Feb  2 04:38:31 np0005604791 systemd[1]: Started Session 30 of User ceph-admin.
Feb  2 04:38:32 np0005604791 systemd-logind[805]: New session 31 of user ceph-admin.
Feb  2 04:38:32 np0005604791 systemd[1]: Started Session 31 of User ceph-admin.
Feb  2 04:38:32 np0005604791 systemd-logind[805]: New session 32 of user ceph-admin.
Feb  2 04:38:32 np0005604791 systemd[1]: Started Session 32 of User ceph-admin.
Feb  2 04:38:33 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:33 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:34 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:34 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:34 np0005604791 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73229 (sysctl)
Feb  2 04:38:34 np0005604791 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb  2 04:38:34 np0005604791 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb  2 04:38:35 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:35 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:35 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:37 np0005604791 systemd[1]: var-lib-containers-storage-overlay-compat3689025887-lower\x2dmapped.mount: Deactivated successfully.
Feb  2 04:38:49 np0005604791 podman[73406]: 2026-02-02 09:38:49.459555545 +0000 UTC m=+14.190186850 container create b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Feb  2 04:38:49 np0005604791 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2021824710-merged.mount: Deactivated successfully.
Feb  2 04:38:49 np0005604791 podman[73406]: 2026-02-02 09:38:49.443523323 +0000 UTC m=+14.174154658 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:38:49 np0005604791 systemd[1]: Created slice Virtual Machine and Container Slice.
Feb  2 04:38:49 np0005604791 systemd[1]: Started libpod-conmon-b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d.scope.
Feb  2 04:38:49 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:38:49 np0005604791 podman[73406]: 2026-02-02 09:38:49.585478182 +0000 UTC m=+14.316109507 container init b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:38:49 np0005604791 podman[73406]: 2026-02-02 09:38:49.59285669 +0000 UTC m=+14.323488035 container start b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:38:49 np0005604791 sad_perlman[73469]: 167 167
Feb  2 04:38:49 np0005604791 systemd[1]: libpod-b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d.scope: Deactivated successfully.
Feb  2 04:38:49 np0005604791 podman[73406]: 2026-02-02 09:38:49.601786962 +0000 UTC m=+14.332418267 container attach b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:38:49 np0005604791 podman[73406]: 2026-02-02 09:38:49.602556013 +0000 UTC m=+14.333187348 container died b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Feb  2 04:38:49 np0005604791 systemd[1]: var-lib-containers-storage-overlay-a43cdac388ff540a4372b0bdff5ee97b6306dab70c2c010bf1ce6716bedd5b0c-merged.mount: Deactivated successfully.
Feb  2 04:38:49 np0005604791 podman[73406]: 2026-02-02 09:38:49.643493596 +0000 UTC m=+14.374124941 container remove b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Feb  2 04:38:49 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:49 np0005604791 systemd[1]: libpod-conmon-b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d.scope: Deactivated successfully.
Feb  2 04:38:49 np0005604791 podman[73496]: 2026-02-02 09:38:49.816523311 +0000 UTC m=+0.090944883 container create 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:38:49 np0005604791 podman[73496]: 2026-02-02 09:38:49.75331668 +0000 UTC m=+0.027738332 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:38:49 np0005604791 systemd[1]: Started libpod-conmon-9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0.scope.
Feb  2 04:38:49 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:38:49 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84b79ac1165815354e27e9923c37a0915eb47f211ef80d5ba87414cf21b984f0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb  2 04:38:49 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84b79ac1165815354e27e9923c37a0915eb47f211ef80d5ba87414cf21b984f0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:38:49 np0005604791 podman[73496]: 2026-02-02 09:38:49.882979183 +0000 UTC m=+0.157400785 container init 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:38:49 np0005604791 podman[73496]: 2026-02-02 09:38:49.889019823 +0000 UTC m=+0.163441405 container start 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:38:49 np0005604791 podman[73496]: 2026-02-02 09:38:49.892605064 +0000 UTC m=+0.167026666 container attach 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]: [
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:    {
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        "available": false,
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        "being_replaced": false,
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        "ceph_device_lvm": false,
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        "device_id": "QEMU_DVD-ROM_QM00001",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        "lsm_data": {},
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        "lvs": [],
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        "path": "/dev/sr0",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        "rejected_reasons": [
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "Has a FileSystem",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "Insufficient space (<5GB)"
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        ],
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        "sys_api": {
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "actuators": null,
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "device_nodes": [
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:                "sr0"
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            ],
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "devname": "sr0",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "human_readable_size": "482.00 KB",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "id_bus": "ata",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "model": "QEMU DVD-ROM",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "nr_requests": "2",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "parent": "/dev/sr0",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "partitions": {},
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "path": "/dev/sr0",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "removable": "1",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "rev": "2.5+",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "ro": "0",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "rotational": "1",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "sas_address": "",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "sas_device_handle": "",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "scheduler_mode": "mq-deadline",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "sectors": 0,
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "sectorsize": "2048",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "size": 493568.0,
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "support_discard": "2048",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "type": "disk",
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:            "vendor": "QEMU"
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:        }
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]:    }
Feb  2 04:38:50 np0005604791 sharp_elgamal[73513]: ]
Feb  2 04:38:50 np0005604791 systemd[1]: libpod-9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0.scope: Deactivated successfully.
Feb  2 04:38:50 np0005604791 podman[74505]: 2026-02-02 09:38:50.669286233 +0000 UTC m=+0.036855009 container died 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 04:38:50 np0005604791 systemd[1]: var-lib-containers-storage-overlay-84b79ac1165815354e27e9923c37a0915eb47f211ef80d5ba87414cf21b984f0-merged.mount: Deactivated successfully.
Feb  2 04:38:50 np0005604791 podman[74505]: 2026-02-02 09:38:50.701568762 +0000 UTC m=+0.069137468 container remove 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:38:50 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:50 np0005604791 systemd[1]: libpod-conmon-9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0.scope: Deactivated successfully.
Feb  2 04:38:53 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:53 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:53 np0005604791 podman[75500]: 2026-02-02 09:38:53.453821064 +0000 UTC m=+0.053079916 container create acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Feb  2 04:38:53 np0005604791 systemd[1]: Started libpod-conmon-acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25.scope.
Feb  2 04:38:53 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:38:53 np0005604791 podman[75500]: 2026-02-02 09:38:53.423039477 +0000 UTC m=+0.022298349 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:38:53 np0005604791 podman[75500]: 2026-02-02 09:38:53.541437472 +0000 UTC m=+0.140696344 container init acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 04:38:53 np0005604791 podman[75500]: 2026-02-02 09:38:53.548841281 +0000 UTC m=+0.148100143 container start acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:38:53 np0005604791 busy_engelbart[75517]: 167 167
Feb  2 04:38:53 np0005604791 systemd[1]: libpod-acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25.scope: Deactivated successfully.
Feb  2 04:38:53 np0005604791 podman[75500]: 2026-02-02 09:38:53.559368977 +0000 UTC m=+0.158627849 container attach acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:38:53 np0005604791 podman[75500]: 2026-02-02 09:38:53.559719237 +0000 UTC m=+0.158978089 container died acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:38:53 np0005604791 podman[75500]: 2026-02-02 09:38:53.616138356 +0000 UTC m=+0.215397208 container remove acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:38:53 np0005604791 systemd[1]: libpod-conmon-acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25.scope: Deactivated successfully.
Feb  2 04:38:53 np0005604791 systemd[1]: Reloading.
Feb  2 04:38:53 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:38:53 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:38:53 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:53 np0005604791 systemd[1]: Reloading.
Feb  2 04:38:53 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:38:53 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:38:54 np0005604791 systemd[1]: Reached target All Ceph clusters and services.
Feb  2 04:38:54 np0005604791 systemd[1]: Reloading.
Feb  2 04:38:54 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:38:54 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:38:54 np0005604791 systemd[1]: Reached target Ceph cluster d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:38:54 np0005604791 systemd[1]: Reloading.
Feb  2 04:38:54 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:38:54 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:38:54 np0005604791 systemd[1]: Reloading.
Feb  2 04:38:54 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:38:54 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:38:54 np0005604791 systemd[1]: Created slice Slice /system/ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:38:54 np0005604791 systemd[1]: Reached target System Time Set.
Feb  2 04:38:54 np0005604791 systemd[1]: Reached target System Time Synchronized.
Feb  2 04:38:54 np0005604791 systemd[1]: Starting Ceph crash.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:38:54 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:54 np0005604791 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb  2 04:38:54 np0005604791 podman[75771]: 2026-02-02 09:38:54.959458037 +0000 UTC m=+0.044369041 container create 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Feb  2 04:38:55 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a85a64eb64edfca97039050bd1f681fe6913753b1aa6e26f5063f680989aa6c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:38:55 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a85a64eb64edfca97039050bd1f681fe6913753b1aa6e26f5063f680989aa6c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:38:55 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a85a64eb64edfca97039050bd1f681fe6913753b1aa6e26f5063f680989aa6c/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:38:55 np0005604791 podman[75771]: 2026-02-02 09:38:55.027942826 +0000 UTC m=+0.112853810 container init 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Feb  2 04:38:55 np0005604791 podman[75771]: 2026-02-02 09:38:54.938644351 +0000 UTC m=+0.023555355 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:38:55 np0005604791 podman[75771]: 2026-02-02 09:38:55.040589533 +0000 UTC m=+0.125500497 container start 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:38:55 np0005604791 bash[75771]: 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e
Feb  2 04:38:55 np0005604791 systemd[1]: Started Ceph crash.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:38:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: INFO:ceph-crash:pinging cluster to exercise our key
Feb  2 04:38:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.228+0000 7f00ded2e640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Feb  2 04:38:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.229+0000 7f00ded2e640 -1 AuthRegistry(0x7f00d8069490) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Feb  2 04:38:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.229+0000 7f00ded2e640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Feb  2 04:38:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.229+0000 7f00ded2e640 -1 AuthRegistry(0x7f00ded2cff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Feb  2 04:38:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.231+0000 7f00dcaa3640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Feb  2 04:38:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.231+0000 7f00ded2e640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Feb  2 04:38:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: [errno 13] RADOS permission denied (error connecting to the cluster)
Feb  2 04:38:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Feb  2 04:38:55 np0005604791 podman[75893]: 2026-02-02 09:38:55.609914271 +0000 UTC m=+0.052808409 container create 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:38:55 np0005604791 systemd[1]: Started libpod-conmon-457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436.scope.
Feb  2 04:38:55 np0005604791 podman[75893]: 2026-02-02 09:38:55.576928352 +0000 UTC m=+0.019822510 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:38:55 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:38:55 np0005604791 podman[75893]: 2026-02-02 09:38:55.706319847 +0000 UTC m=+0.149213975 container init 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:38:55 np0005604791 podman[75893]: 2026-02-02 09:38:55.714634111 +0000 UTC m=+0.157528259 container start 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Feb  2 04:38:55 np0005604791 blissful_bardeen[75909]: 167 167
Feb  2 04:38:55 np0005604791 systemd[1]: libpod-457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436.scope: Deactivated successfully.
Feb  2 04:38:55 np0005604791 podman[75893]: 2026-02-02 09:38:55.721979208 +0000 UTC m=+0.164873326 container attach 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:38:55 np0005604791 podman[75893]: 2026-02-02 09:38:55.722272496 +0000 UTC m=+0.165166614 container died 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Feb  2 04:38:55 np0005604791 systemd[1]: var-lib-containers-storage-overlay-8f9d9cfcf40122678e6ff3cf722977bbe9582d13cc4c48bfcc339b56dfc8d874-merged.mount: Deactivated successfully.
Feb  2 04:38:55 np0005604791 podman[75893]: 2026-02-02 09:38:55.852430723 +0000 UTC m=+0.295324841 container remove 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 04:38:55 np0005604791 systemd[1]: libpod-conmon-457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436.scope: Deactivated successfully.
Feb  2 04:38:55 np0005604791 podman[75933]: 2026-02-02 09:38:55.98049393 +0000 UTC m=+0.041830809 container create 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2)
Feb  2 04:38:56 np0005604791 systemd[1]: Started libpod-conmon-824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085.scope.
Feb  2 04:38:56 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:38:56 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb  2 04:38:56 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:38:56 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:38:56 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:38:56 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:38:56 np0005604791 podman[75933]: 2026-02-02 09:38:55.961136485 +0000 UTC m=+0.022473364 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:38:56 np0005604791 podman[75933]: 2026-02-02 09:38:56.069890659 +0000 UTC m=+0.131227548 container init 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:38:56 np0005604791 podman[75933]: 2026-02-02 09:38:56.076916797 +0000 UTC m=+0.138253646 container start 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:38:56 np0005604791 podman[75933]: 2026-02-02 09:38:56.08024266 +0000 UTC m=+0.141579539 container attach 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid)
Feb  2 04:38:56 np0005604791 sweet_mirzakhani[75949]: --> passed data devices: 0 physical, 1 LVM
Feb  2 04:38:56 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb  2 04:38:56 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb  2 04:38:56 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 273baa6d-671d-41d3-8896-5eac2274aa10
Feb  2 04:38:56 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Feb  2 04:38:56 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Feb  2 04:38:56 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb  2 04:38:56 np0005604791 lvm[76010]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb  2 04:38:56 np0005604791 lvm[76010]: VG ceph_vg0 finished
Feb  2 04:38:56 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb  2 04:38:56 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Feb  2 04:38:57 np0005604791 sweet_mirzakhani[75949]: stderr: got monmap epoch 1
Feb  2 04:38:57 np0005604791 sweet_mirzakhani[75949]: --> Creating keyring file for osd.0
Feb  2 04:38:57 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Feb  2 04:38:57 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Feb  2 04:38:57 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 273baa6d-671d-41d3-8896-5eac2274aa10 --setuser ceph --setgroup ceph
Feb  2 04:39:00 np0005604791 sweet_mirzakhani[75949]: stderr: 2026-02-02T09:38:57.425+0000 7f9d27a7f740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) No valid bdev label found
Feb  2 04:39:00 np0005604791 sweet_mirzakhani[75949]: stderr: 2026-02-02T09:38:57.688+0000 7f9d27a7f740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Feb  2 04:39:00 np0005604791 sweet_mirzakhani[75949]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Feb  2 04:39:00 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb  2 04:39:00 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Feb  2 04:39:01 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:01 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:01 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb  2 04:39:01 np0005604791 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb  2 04:39:01 np0005604791 sweet_mirzakhani[75949]: --> ceph-volume lvm activate successful for osd ID: 0
Feb  2 04:39:01 np0005604791 sweet_mirzakhani[75949]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Feb  2 04:39:01 np0005604791 systemd[1]: libpod-824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085.scope: Deactivated successfully.
Feb  2 04:39:01 np0005604791 systemd[1]: libpod-824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085.scope: Consumed 1.839s CPU time.
Feb  2 04:39:01 np0005604791 podman[75933]: 2026-02-02 09:39:01.138893693 +0000 UTC m=+5.200230542 container died 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:39:01 np0005604791 systemd[1]: var-lib-containers-storage-overlay-cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d-merged.mount: Deactivated successfully.
Feb  2 04:39:01 np0005604791 podman[75933]: 2026-02-02 09:39:01.19415015 +0000 UTC m=+5.255486999 container remove 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Feb  2 04:39:01 np0005604791 systemd[1]: libpod-conmon-824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085.scope: Deactivated successfully.
Feb  2 04:39:01 np0005604791 podman[77031]: 2026-02-02 09:39:01.775857665 +0000 UTC m=+0.037717423 container create 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:39:01 np0005604791 systemd[1]: Started libpod-conmon-1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd.scope.
Feb  2 04:39:01 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:01 np0005604791 podman[77031]: 2026-02-02 09:39:01.852666099 +0000 UTC m=+0.114525907 container init 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Feb  2 04:39:01 np0005604791 podman[77031]: 2026-02-02 09:39:01.758365063 +0000 UTC m=+0.020224811 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:01 np0005604791 podman[77031]: 2026-02-02 09:39:01.861784956 +0000 UTC m=+0.123644704 container start 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Feb  2 04:39:01 np0005604791 podman[77031]: 2026-02-02 09:39:01.866456798 +0000 UTC m=+0.128316546 container attach 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Feb  2 04:39:01 np0005604791 compassionate_solomon[77047]: 167 167
Feb  2 04:39:01 np0005604791 systemd[1]: libpod-1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd.scope: Deactivated successfully.
Feb  2 04:39:01 np0005604791 podman[77031]: 2026-02-02 09:39:01.868226038 +0000 UTC m=+0.130085786 container died 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb  2 04:39:01 np0005604791 systemd[1]: var-lib-containers-storage-overlay-5dcd08a1eb775db66759cd5e261e2f3cf23a30cd2f7ce62f711a278766950711-merged.mount: Deactivated successfully.
Feb  2 04:39:01 np0005604791 podman[77031]: 2026-02-02 09:39:01.902230315 +0000 UTC m=+0.164090033 container remove 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Feb  2 04:39:01 np0005604791 systemd[1]: libpod-conmon-1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd.scope: Deactivated successfully.
Feb  2 04:39:02 np0005604791 podman[77071]: 2026-02-02 09:39:02.040057078 +0000 UTC m=+0.042996582 container create 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb  2 04:39:02 np0005604791 systemd[1]: Started libpod-conmon-13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6.scope.
Feb  2 04:39:02 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:02 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52469679324de6d5d63ad6e7b79f8a0bc3886a66579e7d35bcc6168386cef263/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:02 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52469679324de6d5d63ad6e7b79f8a0bc3886a66579e7d35bcc6168386cef263/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:02 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52469679324de6d5d63ad6e7b79f8a0bc3886a66579e7d35bcc6168386cef263/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:02 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52469679324de6d5d63ad6e7b79f8a0bc3886a66579e7d35bcc6168386cef263/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:02 np0005604791 podman[77071]: 2026-02-02 09:39:02.022820813 +0000 UTC m=+0.025760367 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:02 np0005604791 podman[77071]: 2026-02-02 09:39:02.145451647 +0000 UTC m=+0.148391181 container init 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:39:02 np0005604791 podman[77071]: 2026-02-02 09:39:02.153762521 +0000 UTC m=+0.156702035 container start 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True)
Feb  2 04:39:02 np0005604791 podman[77071]: 2026-02-02 09:39:02.157904008 +0000 UTC m=+0.160843582 container attach 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]: {
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:    "0": [
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:        {
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "devices": [
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "/dev/loop3"
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            ],
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "lv_name": "ceph_lv0",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "lv_size": "21470642176",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=ZuZGzb-UVFf-JLk7-u6dl-HCpn-c9Dl-1zqhCS,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=d241d473-9fcb-5f74-b163-f1ca4454e7f1,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=273baa6d-671d-41d3-8896-5eac2274aa10,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "lv_uuid": "ZuZGzb-UVFf-JLk7-u6dl-HCpn-c9Dl-1zqhCS",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "name": "ceph_lv0",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "path": "/dev/ceph_vg0/ceph_lv0",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "tags": {
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.block_uuid": "ZuZGzb-UVFf-JLk7-u6dl-HCpn-c9Dl-1zqhCS",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.cephx_lockbox_secret": "",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.cluster_fsid": "d241d473-9fcb-5f74-b163-f1ca4454e7f1",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.cluster_name": "ceph",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.crush_device_class": "",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.encrypted": "0",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.osd_fsid": "273baa6d-671d-41d3-8896-5eac2274aa10",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.osd_id": "0",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.osdspec_affinity": "default_drive_group",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.type": "block",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.vdo": "0",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:                "ceph.with_tpm": "0"
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            },
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "type": "block",
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:            "vg_name": "ceph_vg0"
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:        }
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]:    ]
Feb  2 04:39:02 np0005604791 suspicious_tu[77087]: }
Feb  2 04:39:02 np0005604791 systemd[1]: libpod-13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6.scope: Deactivated successfully.
Feb  2 04:39:02 np0005604791 podman[77071]: 2026-02-02 09:39:02.455360867 +0000 UTC m=+0.458300411 container died 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb  2 04:39:02 np0005604791 systemd[1]: var-lib-containers-storage-overlay-52469679324de6d5d63ad6e7b79f8a0bc3886a66579e7d35bcc6168386cef263-merged.mount: Deactivated successfully.
Feb  2 04:39:02 np0005604791 podman[77071]: 2026-02-02 09:39:02.499976014 +0000 UTC m=+0.502915558 container remove 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Feb  2 04:39:02 np0005604791 systemd[1]: libpod-conmon-13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6.scope: Deactivated successfully.
Feb  2 04:39:03 np0005604791 podman[77199]: 2026-02-02 09:39:03.047137568 +0000 UTC m=+0.042483188 container create 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:39:03 np0005604791 systemd[1]: Started libpod-conmon-8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723.scope.
Feb  2 04:39:03 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:03 np0005604791 podman[77199]: 2026-02-02 09:39:03.024935843 +0000 UTC m=+0.020281523 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:03 np0005604791 podman[77199]: 2026-02-02 09:39:03.129335644 +0000 UTC m=+0.124681234 container init 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Feb  2 04:39:03 np0005604791 podman[77199]: 2026-02-02 09:39:03.136625699 +0000 UTC m=+0.131971319 container start 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:39:03 np0005604791 podman[77199]: 2026-02-02 09:39:03.140507888 +0000 UTC m=+0.135853568 container attach 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:39:03 np0005604791 jolly_lederberg[77215]: 167 167
Feb  2 04:39:03 np0005604791 systemd[1]: libpod-8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723.scope: Deactivated successfully.
Feb  2 04:39:03 np0005604791 podman[77199]: 2026-02-02 09:39:03.142117654 +0000 UTC m=+0.137463284 container died 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Feb  2 04:39:03 np0005604791 systemd[1]: var-lib-containers-storage-overlay-f6fcf546da33ddb409fc05eee577bcd92b594c3aa6eac9e11db1c1c417e2879b-merged.mount: Deactivated successfully.
Feb  2 04:39:03 np0005604791 podman[77199]: 2026-02-02 09:39:03.178815697 +0000 UTC m=+0.174161297 container remove 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:39:03 np0005604791 systemd[1]: libpod-conmon-8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723.scope: Deactivated successfully.
Feb  2 04:39:03 np0005604791 podman[77247]: 2026-02-02 09:39:03.451502289 +0000 UTC m=+0.052204002 container create 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:39:03 np0005604791 systemd[1]: Started libpod-conmon-9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d.scope.
Feb  2 04:39:03 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:03 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:03 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:03 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:03 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:03 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:03 np0005604791 podman[77247]: 2026-02-02 09:39:03.430318572 +0000 UTC m=+0.031020375 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:03 np0005604791 podman[77247]: 2026-02-02 09:39:03.534359713 +0000 UTC m=+0.135061526 container init 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:39:03 np0005604791 podman[77247]: 2026-02-02 09:39:03.543525051 +0000 UTC m=+0.144226784 container start 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Feb  2 04:39:03 np0005604791 podman[77247]: 2026-02-02 09:39:03.547466762 +0000 UTC m=+0.148168545 container attach 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:39:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test[77263]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Feb  2 04:39:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test[77263]:                            [--no-systemd] [--no-tmpfs]
Feb  2 04:39:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test[77263]: ceph-volume activate: error: unrecognized arguments: --bad-option
Feb  2 04:39:03 np0005604791 systemd[1]: libpod-9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d.scope: Deactivated successfully.
Feb  2 04:39:03 np0005604791 podman[77247]: 2026-02-02 09:39:03.730635172 +0000 UTC m=+0.331336905 container died 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Feb  2 04:39:03 np0005604791 systemd[1]: var-lib-containers-storage-overlay-a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd-merged.mount: Deactivated successfully.
Feb  2 04:39:03 np0005604791 podman[77247]: 2026-02-02 09:39:03.77174386 +0000 UTC m=+0.372445563 container remove 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Feb  2 04:39:03 np0005604791 systemd[1]: libpod-conmon-9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d.scope: Deactivated successfully.
Feb  2 04:39:03 np0005604791 systemd[1]: Reloading.
Feb  2 04:39:04 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:39:04 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:39:04 np0005604791 systemd[1]: Reloading.
Feb  2 04:39:04 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:39:04 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:39:04 np0005604791 systemd[1]: Starting Ceph osd.0 for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:39:04 np0005604791 podman[77426]: 2026-02-02 09:39:04.669777378 +0000 UTC m=+0.070494467 container create ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Feb  2 04:39:04 np0005604791 podman[77426]: 2026-02-02 09:39:04.622958019 +0000 UTC m=+0.023675088 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:04 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:04 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:04 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:04 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:04 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:04 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:04 np0005604791 podman[77426]: 2026-02-02 09:39:04.781873956 +0000 UTC m=+0.182591035 container init ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Feb  2 04:39:04 np0005604791 podman[77426]: 2026-02-02 09:39:04.788984516 +0000 UTC m=+0.189701605 container start ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Feb  2 04:39:04 np0005604791 podman[77426]: 2026-02-02 09:39:04.798985628 +0000 UTC m=+0.199702707 container attach ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Feb  2 04:39:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb  2 04:39:04 np0005604791 bash[77426]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb  2 04:39:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb  2 04:39:04 np0005604791 bash[77426]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb  2 04:39:05 np0005604791 lvm[77522]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb  2 04:39:05 np0005604791 lvm[77522]: VG ceph_vg0 finished
Feb  2 04:39:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: --> Failed to activate via raw: did not find any matching OSD to activate
Feb  2 04:39:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb  2 04:39:05 np0005604791 bash[77426]: --> Failed to activate via raw: did not find any matching OSD to activate
Feb  2 04:39:05 np0005604791 bash[77426]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb  2 04:39:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb  2 04:39:05 np0005604791 bash[77426]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb  2 04:39:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb  2 04:39:05 np0005604791 bash[77426]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb  2 04:39:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Feb  2 04:39:05 np0005604791 bash[77426]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Feb  2 04:39:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:06 np0005604791 bash[77426]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:06 np0005604791 bash[77426]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb  2 04:39:06 np0005604791 bash[77426]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb  2 04:39:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb  2 04:39:06 np0005604791 bash[77426]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb  2 04:39:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: --> ceph-volume lvm activate successful for osd ID: 0
Feb  2 04:39:06 np0005604791 bash[77426]: --> ceph-volume lvm activate successful for osd ID: 0
Feb  2 04:39:06 np0005604791 systemd[1]: libpod-ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785.scope: Deactivated successfully.
Feb  2 04:39:06 np0005604791 podman[77426]: 2026-02-02 09:39:06.108599609 +0000 UTC m=+1.509316668 container died ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb  2 04:39:06 np0005604791 systemd[1]: libpod-ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785.scope: Consumed 1.306s CPU time.
Feb  2 04:39:06 np0005604791 systemd[1]: var-lib-containers-storage-overlay-1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac-merged.mount: Deactivated successfully.
Feb  2 04:39:06 np0005604791 podman[77426]: 2026-02-02 09:39:06.144872881 +0000 UTC m=+1.545589930 container remove ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:39:06 np0005604791 podman[77671]: 2026-02-02 09:39:06.34186864 +0000 UTC m=+0.044013840 container create 4f53c93054a9f438fb3ecc749c307ebb8ea38df5790a8803f0451fb2ff7bfad3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:39:06 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8e173f1bfdda3a8998be8783246431a26077ee4afa149923003b7ab42e6be4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:06 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8e173f1bfdda3a8998be8783246431a26077ee4afa149923003b7ab42e6be4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:06 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8e173f1bfdda3a8998be8783246431a26077ee4afa149923003b7ab42e6be4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:06 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8e173f1bfdda3a8998be8783246431a26077ee4afa149923003b7ab42e6be4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:06 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8e173f1bfdda3a8998be8783246431a26077ee4afa149923003b7ab42e6be4/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:06 np0005604791 podman[77671]: 2026-02-02 09:39:06.402893229 +0000 UTC m=+0.105038469 container init 4f53c93054a9f438fb3ecc749c307ebb8ea38df5790a8803f0451fb2ff7bfad3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 04:39:06 np0005604791 podman[77671]: 2026-02-02 09:39:06.316916607 +0000 UTC m=+0.019061817 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:06 np0005604791 podman[77671]: 2026-02-02 09:39:06.416179634 +0000 UTC m=+0.118324844 container start 4f53c93054a9f438fb3ecc749c307ebb8ea38df5790a8803f0451fb2ff7bfad3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Feb  2 04:39:06 np0005604791 bash[77671]: 4f53c93054a9f438fb3ecc749c307ebb8ea38df5790a8803f0451fb2ff7bfad3
Feb  2 04:39:06 np0005604791 systemd[1]: Started Ceph osd.0 for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: set uid:gid to 167:167 (ceph:ceph)
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: pidfile_write: ignore empty --pid-file
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:06 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:06 np0005604791 podman[77801]: 2026-02-02 09:39:06.971889568 +0000 UTC m=+0.051718128 container create bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:39:07 np0005604791 systemd[1]: Started libpod-conmon-bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918.scope.
Feb  2 04:39:07 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:07 np0005604791 podman[77801]: 2026-02-02 09:39:07.041056027 +0000 UTC m=+0.120884607 container init bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Feb  2 04:39:07 np0005604791 podman[77801]: 2026-02-02 09:39:06.94993469 +0000 UTC m=+0.029763300 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:07 np0005604791 podman[77801]: 2026-02-02 09:39:07.052565791 +0000 UTC m=+0.132394391 container start bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Feb  2 04:39:07 np0005604791 podman[77801]: 2026-02-02 09:39:07.056862132 +0000 UTC m=+0.136690722 container attach bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2)
Feb  2 04:39:07 np0005604791 cool_morse[77817]: 167 167
Feb  2 04:39:07 np0005604791 systemd[1]: libpod-bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918.scope: Deactivated successfully.
Feb  2 04:39:07 np0005604791 podman[77801]: 2026-02-02 09:39:07.058358774 +0000 UTC m=+0.138187374 container died bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:39:07 np0005604791 systemd[1]: var-lib-containers-storage-overlay-803654d395faadb15789c9ab5627dbe13a389df7419e35874d4d864aaf9d2268-merged.mount: Deactivated successfully.
Feb  2 04:39:07 np0005604791 podman[77801]: 2026-02-02 09:39:07.113374604 +0000 UTC m=+0.193203204 container remove bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Feb  2 04:39:07 np0005604791 systemd[1]: libpod-conmon-bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918.scope: Deactivated successfully.
Feb  2 04:39:07 np0005604791 podman[77844]: 2026-02-02 09:39:07.287250602 +0000 UTC m=+0.061235226 container create 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7c00 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:07 np0005604791 systemd[1]: Started libpod-conmon-185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09.scope.
Feb  2 04:39:07 np0005604791 podman[77844]: 2026-02-02 09:39:07.261787355 +0000 UTC m=+0.035772039 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:07 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:07 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a6854a4c6a97cec9e59b7ed4594f37d280e811cdd8a3df9aac88073c5581177/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:07 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a6854a4c6a97cec9e59b7ed4594f37d280e811cdd8a3df9aac88073c5581177/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:07 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a6854a4c6a97cec9e59b7ed4594f37d280e811cdd8a3df9aac88073c5581177/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:07 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a6854a4c6a97cec9e59b7ed4594f37d280e811cdd8a3df9aac88073c5581177/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:07 np0005604791 podman[77844]: 2026-02-02 09:39:07.380099728 +0000 UTC m=+0.154084372 container init 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Feb  2 04:39:07 np0005604791 podman[77844]: 2026-02-02 09:39:07.386122147 +0000 UTC m=+0.160106731 container start 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:39:07 np0005604791 podman[77844]: 2026-02-02 09:39:07.389310687 +0000 UTC m=+0.163295351 container attach 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: load: jerasure load: lrc 
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Feb  2 04:39:07 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:07 np0005604791 lvm[77942]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb  2 04:39:07 np0005604791 lvm[77942]: VG ceph_vg0 finished
Feb  2 04:39:08 np0005604791 funny_blackwell[77863]: {}
Feb  2 04:39:08 np0005604791 systemd[1]: libpod-185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09.scope: Deactivated successfully.
Feb  2 04:39:08 np0005604791 systemd[1]: libpod-185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09.scope: Consumed 1.018s CPU time.
Feb  2 04:39:08 np0005604791 podman[77844]: 2026-02-02 09:39:08.08226692 +0000 UTC m=+0.856251534 container died 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default)
Feb  2 04:39:08 np0005604791 systemd[1]: var-lib-containers-storage-overlay-1a6854a4c6a97cec9e59b7ed4594f37d280e811cdd8a3df9aac88073c5581177-merged.mount: Deactivated successfully.
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:08 np0005604791 podman[77844]: 2026-02-02 09:39:08.132876755 +0000 UTC m=+0.906861379 container remove 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:39:08 np0005604791 systemd[1]: libpod-conmon-185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09.scope: Deactivated successfully.
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount shared_bdev_used = 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: RocksDB version: 7.9.2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Git sha 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Compile date 2025-07-17 03:12:14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: DB SUMMARY
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: DB Session ID:  URAREYQTFRBAH0CRPE4U
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: CURRENT file:  CURRENT
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: IDENTITY file:  IDENTITY
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                         Options.error_if_exists: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.create_if_missing: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                         Options.paranoid_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                                     Options.env: 0x5616df813dc0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                                Options.info_log: 0x5616df8177a0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_file_opening_threads: 16
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                              Options.statistics: (nil)
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.use_fsync: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.max_log_file_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.keep_log_file_num: 1000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.recycle_log_file_num: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                         Options.allow_fallocate: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.allow_mmap_reads: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.allow_mmap_writes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.use_direct_reads: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.create_missing_column_families: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                              Options.db_log_dir: 
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                                 Options.wal_dir: db.wal
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.table_cache_numshardbits: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.advise_random_on_open: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.db_write_buffer_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.write_buffer_manager: 0x5616df91ea00
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                            Options.rate_limiter: (nil)
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.wal_recovery_mode: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.enable_thread_tracking: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.enable_pipelined_write: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.unordered_write: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.row_cache: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                              Options.wal_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.allow_ingest_behind: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.two_write_queues: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.manual_wal_flush: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.wal_compression: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.atomic_flush: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.log_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.best_efforts_recovery: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.allow_data_in_errors: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.db_host_id: __hostname__
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.enforce_single_del_contracts: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.max_background_jobs: 4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.max_background_compactions: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.max_subcompactions: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.delayed_write_rate : 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.max_open_files: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.bytes_per_sync: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.max_background_flushes: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Compression algorithms supported:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kZSTD supported: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kXpressCompression supported: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kBZip2Compression supported: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kLZ4Compression supported: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kZlibCompression supported: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kLZ4HCCompression supported: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kSnappyCompression supported: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Fast CRC32 supported: Supported on x86
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: DMutex implementation: pthread_mutex_t
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3c9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3c9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3c9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: b15bed18-a53e-4ccb-bbb2-c9066cfaeff1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148509956, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148510262, "job": 1, "event": "recovery_finished"}
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: freelist init
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: freelist _read_cfg
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs umount
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) close
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluefs mount shared_bdev_used = 4718592
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: RocksDB version: 7.9.2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Git sha 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Compile date 2025-07-17 03:12:14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: DB SUMMARY
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: DB Session ID:  URAREYQTFRBAH0CRPE4V
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: CURRENT file:  CURRENT
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: IDENTITY file:  IDENTITY
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                         Options.error_if_exists: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.create_if_missing: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                         Options.paranoid_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                                     Options.env: 0x5616df9c2310
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                                Options.info_log: 0x5616df817940
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_file_opening_threads: 16
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                              Options.statistics: (nil)
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.use_fsync: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.max_log_file_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.keep_log_file_num: 1000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.recycle_log_file_num: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                         Options.allow_fallocate: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.allow_mmap_reads: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.allow_mmap_writes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.use_direct_reads: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.create_missing_column_families: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                              Options.db_log_dir: 
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                                 Options.wal_dir: db.wal
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.table_cache_numshardbits: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.advise_random_on_open: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.db_write_buffer_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.write_buffer_manager: 0x5616df91ea00
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                            Options.rate_limiter: (nil)
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.wal_recovery_mode: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.enable_thread_tracking: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.enable_pipelined_write: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.unordered_write: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.row_cache: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                              Options.wal_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.allow_ingest_behind: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.two_write_queues: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.manual_wal_flush: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.wal_compression: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.atomic_flush: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.log_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.best_efforts_recovery: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.allow_data_in_errors: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.db_host_id: __hostname__
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.enforce_single_del_contracts: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.max_background_jobs: 4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.max_background_compactions: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.max_subcompactions: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.delayed_write_rate : 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.max_open_files: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.bytes_per_sync: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.max_background_flushes: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Compression algorithms supported:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kZSTD supported: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kXpressCompression supported: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kBZip2Compression supported: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kLZ4Compression supported: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kZlibCompression supported: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kLZ4HCCompression supported: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: #011kSnappyCompression supported: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Fast CRC32 supported: Supported on x86
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: DMutex implementation: pthread_mutex_t
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3d350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5616dea3d350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5616dea3d350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5616dea3d350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817ac0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5616dea3c9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817ac0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5616dea3c9b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817ac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5616dea3c9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: b15bed18-a53e-4ccb-bbb2-c9066cfaeff1
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148775268, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148780566, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025148, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b15bed18-a53e-4ccb-bbb2-c9066cfaeff1", "db_session_id": "URAREYQTFRBAH0CRPE4V", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148787062, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025148, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b15bed18-a53e-4ccb-bbb2-c9066cfaeff1", "db_session_id": "URAREYQTFRBAH0CRPE4V", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148789913, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025148, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b15bed18-a53e-4ccb-bbb2-c9066cfaeff1", "db_session_id": "URAREYQTFRBAH0CRPE4V", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148791224, "job": 1, "event": "recovery_finished"}
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5616dfa14000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: DB pointer 0x5616df9d0000
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usag
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: _get_class not permitted to load lua
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: _get_class not permitted to load sdk
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: osd.0 0 load_pgs
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: osd.0 0 load_pgs opened 0 pgs
Feb  2 04:39:08 np0005604791 ceph-osd[77691]: osd.0 0 log_to_monitors true
Feb  2 04:39:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0[77687]: 2026-02-02T09:39:08.813+0000 7fb40eadf740 -1 osd.0 0 log_to_monitors true
Feb  2 04:39:09 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Feb  2 04:39:09 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Feb  2 04:39:10 np0005604791 podman[78526]: 2026-02-02 09:39:10.284630501 +0000 UTC m=+0.072379870 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:39:10 np0005604791 podman[78526]: 2026-02-02 09:39:10.395582536 +0000 UTC m=+0.183331895 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Feb  2 04:39:10 np0005604791 ceph-osd[77691]: osd.0 0 done with init, starting boot process
Feb  2 04:39:10 np0005604791 ceph-osd[77691]: osd.0 0 start_boot
Feb  2 04:39:10 np0005604791 ceph-osd[77691]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Feb  2 04:39:10 np0005604791 ceph-osd[77691]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Feb  2 04:39:10 np0005604791 ceph-osd[77691]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Feb  2 04:39:10 np0005604791 ceph-osd[77691]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Feb  2 04:39:10 np0005604791 ceph-osd[77691]: osd.0 0  bench count 12288000 bsize 4 KiB
Feb  2 04:39:11 np0005604791 podman[78666]: 2026-02-02 09:39:11.078422972 +0000 UTC m=+0.056109962 container create 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb  2 04:39:11 np0005604791 systemd[1]: Started libpod-conmon-271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e.scope.
Feb  2 04:39:11 np0005604791 podman[78666]: 2026-02-02 09:39:11.0414428 +0000 UTC m=+0.019129780 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:11 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:11 np0005604791 podman[78666]: 2026-02-02 09:39:11.168421897 +0000 UTC m=+0.146108887 container init 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:39:11 np0005604791 podman[78666]: 2026-02-02 09:39:11.17704501 +0000 UTC m=+0.154731970 container start 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Feb  2 04:39:11 np0005604791 blissful_cannon[78682]: 167 167
Feb  2 04:39:11 np0005604791 systemd[1]: libpod-271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e.scope: Deactivated successfully.
Feb  2 04:39:11 np0005604791 podman[78666]: 2026-02-02 09:39:11.186358363 +0000 UTC m=+0.164045323 container attach 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Feb  2 04:39:11 np0005604791 podman[78666]: 2026-02-02 09:39:11.186700172 +0000 UTC m=+0.164387162 container died 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Feb  2 04:39:11 np0005604791 systemd[1]: var-lib-containers-storage-overlay-9b61ab76e6f878e3f1c921f127571566dde26f12639941dbf85293ca7a6aecdd-merged.mount: Deactivated successfully.
Feb  2 04:39:11 np0005604791 podman[78666]: 2026-02-02 09:39:11.285260859 +0000 UTC m=+0.262947819 container remove 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Feb  2 04:39:11 np0005604791 systemd[1]: libpod-conmon-271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e.scope: Deactivated successfully.
Feb  2 04:39:11 np0005604791 podman[78705]: 2026-02-02 09:39:11.382724634 +0000 UTC m=+0.024364547 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:11 np0005604791 podman[78705]: 2026-02-02 09:39:11.507930062 +0000 UTC m=+0.149569955 container create 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Feb  2 04:39:11 np0005604791 systemd[1]: Started libpod-conmon-5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2.scope.
Feb  2 04:39:11 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:11 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d1b0af21475f5042f80d3713eb11a421a51ba53ba588a31103d8e278c92c5ca/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:11 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d1b0af21475f5042f80d3713eb11a421a51ba53ba588a31103d8e278c92c5ca/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:11 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d1b0af21475f5042f80d3713eb11a421a51ba53ba588a31103d8e278c92c5ca/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:11 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d1b0af21475f5042f80d3713eb11a421a51ba53ba588a31103d8e278c92c5ca/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:11 np0005604791 podman[78705]: 2026-02-02 09:39:11.818591613 +0000 UTC m=+0.460231536 container init 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Feb  2 04:39:11 np0005604791 podman[78705]: 2026-02-02 09:39:11.827602037 +0000 UTC m=+0.469241970 container start 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:39:12 np0005604791 podman[78705]: 2026-02-02 09:39:12.023404973 +0000 UTC m=+0.665044876 container attach 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True)
Feb  2 04:39:12 np0005604791 naughty_tu[78723]: [
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:    {
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        "available": false,
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        "being_replaced": false,
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        "ceph_device_lvm": false,
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        "device_id": "QEMU_DVD-ROM_QM00001",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        "lsm_data": {},
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        "lvs": [],
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        "path": "/dev/sr0",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        "rejected_reasons": [
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "Insufficient space (<5GB)",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "Has a FileSystem"
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        ],
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        "sys_api": {
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "actuators": null,
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "device_nodes": [
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:                "sr0"
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            ],
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "devname": "sr0",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "human_readable_size": "482.00 KB",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "id_bus": "ata",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "model": "QEMU DVD-ROM",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "nr_requests": "2",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "parent": "/dev/sr0",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "partitions": {},
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "path": "/dev/sr0",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "removable": "1",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "rev": "2.5+",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "ro": "0",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "rotational": "1",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "sas_address": "",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "sas_device_handle": "",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "scheduler_mode": "mq-deadline",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "sectors": 0,
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "sectorsize": "2048",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "size": 493568.0,
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "support_discard": "2048",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "type": "disk",
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:            "vendor": "QEMU"
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:        }
Feb  2 04:39:12 np0005604791 naughty_tu[78723]:    }
Feb  2 04:39:12 np0005604791 naughty_tu[78723]: ]
Feb  2 04:39:12 np0005604791 systemd[1]: libpod-5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2.scope: Deactivated successfully.
Feb  2 04:39:12 np0005604791 podman[78705]: 2026-02-02 09:39:12.522470171 +0000 UTC m=+1.164110064 container died 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Feb  2 04:39:12 np0005604791 systemd[1]: var-lib-containers-storage-overlay-6d1b0af21475f5042f80d3713eb11a421a51ba53ba588a31103d8e278c92c5ca-merged.mount: Deactivated successfully.
Feb  2 04:39:12 np0005604791 podman[78705]: 2026-02-02 09:39:12.610944563 +0000 UTC m=+1.252584486 container remove 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2)
Feb  2 04:39:12 np0005604791 systemd[1]: libpod-conmon-5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2.scope: Deactivated successfully.
Feb  2 04:39:14 np0005604791 ceph-osd[77691]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 19.526 iops: 4998.717 elapsed_sec: 0.600
Feb  2 04:39:14 np0005604791 ceph-osd[77691]: log_channel(cluster) log [WRN] : OSD bench result of 4998.717013 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb  2 04:39:14 np0005604791 ceph-osd[77691]: osd.0 0 waiting for initial osdmap
Feb  2 04:39:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0[77687]: 2026-02-02T09:39:14.712+0000 7fb40aa62640 -1 osd.0 0 waiting for initial osdmap
Feb  2 04:39:14 np0005604791 ceph-osd[77691]: osd.0 9 crush map has features 288514050185494528, adjusting msgr requires for clients
Feb  2 04:39:14 np0005604791 ceph-osd[77691]: osd.0 9 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Feb  2 04:39:14 np0005604791 ceph-osd[77691]: osd.0 9 crush map has features 3314932999778484224, adjusting msgr requires for osds
Feb  2 04:39:14 np0005604791 ceph-osd[77691]: osd.0 9 check_osdmap_features require_osd_release unknown -> squid
Feb  2 04:39:14 np0005604791 ceph-osd[77691]: osd.0 9 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb  2 04:39:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0[77687]: 2026-02-02T09:39:14.743+0000 7fb40608a640 -1 osd.0 9 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb  2 04:39:14 np0005604791 ceph-osd[77691]: osd.0 9 set_numa_affinity not setting numa affinity
Feb  2 04:39:14 np0005604791 ceph-osd[77691]: osd.0 9 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Feb  2 04:39:15 np0005604791 ceph-osd[77691]: osd.0 9 tick checking mon for new map
Feb  2 04:39:15 np0005604791 ceph-osd[77691]: osd.0 10 state: booting -> active
Feb  2 04:39:16 np0005604791 ceph-osd[77691]: osd.0 11 crush map has features 288514051259236352, adjusting msgr requires for clients
Feb  2 04:39:16 np0005604791 ceph-osd[77691]: osd.0 11 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Feb  2 04:39:16 np0005604791 ceph-osd[77691]: osd.0 11 crush map has features 3314933000852226048, adjusting msgr requires for osds
Feb  2 04:39:33 np0005604791 podman[79883]: 2026-02-02 09:39:33.956260586 +0000 UTC m=+0.041663464 container create d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:39:33 np0005604791 systemd[1]: Started libpod-conmon-d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596.scope.
Feb  2 04:39:34 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:34 np0005604791 podman[79883]: 2026-02-02 09:39:33.937396541 +0000 UTC m=+0.022799439 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:34 np0005604791 podman[79883]: 2026-02-02 09:39:34.03367214 +0000 UTC m=+0.119075018 container init d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Feb  2 04:39:34 np0005604791 podman[79883]: 2026-02-02 09:39:34.040041384 +0000 UTC m=+0.125444272 container start d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Feb  2 04:39:34 np0005604791 podman[79883]: 2026-02-02 09:39:34.042837426 +0000 UTC m=+0.128240304 container attach d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Feb  2 04:39:34 np0005604791 wonderful_keller[79899]: 167 167
Feb  2 04:39:34 np0005604791 systemd[1]: libpod-d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596.scope: Deactivated successfully.
Feb  2 04:39:34 np0005604791 podman[79883]: 2026-02-02 09:39:34.044946181 +0000 UTC m=+0.130349059 container died d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:39:34 np0005604791 systemd[1]: var-lib-containers-storage-overlay-e47b27560c9f1de2943fda8b54a351f70eb847a54cac1705d764496bc137908e-merged.mount: Deactivated successfully.
Feb  2 04:39:34 np0005604791 podman[79883]: 2026-02-02 09:39:34.083854023 +0000 UTC m=+0.169256901 container remove d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb  2 04:39:34 np0005604791 systemd[1]: libpod-conmon-d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596.scope: Deactivated successfully.
Feb  2 04:39:34 np0005604791 podman[79917]: 2026-02-02 09:39:34.156773571 +0000 UTC m=+0.050490372 container create ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:39:34 np0005604791 systemd[1]: Started libpod-conmon-ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a.scope.
Feb  2 04:39:34 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:34 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d419a8073852563555ed78329b04ef7e1da550e13c8f5c0b2f85e031f2767cef/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:34 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d419a8073852563555ed78329b04ef7e1da550e13c8f5c0b2f85e031f2767cef/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:34 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d419a8073852563555ed78329b04ef7e1da550e13c8f5c0b2f85e031f2767cef/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:34 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d419a8073852563555ed78329b04ef7e1da550e13c8f5c0b2f85e031f2767cef/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:34 np0005604791 podman[79917]: 2026-02-02 09:39:34.228649462 +0000 UTC m=+0.122366283 container init ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:39:34 np0005604791 podman[79917]: 2026-02-02 09:39:34.138577242 +0000 UTC m=+0.032294043 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:34 np0005604791 podman[79917]: 2026-02-02 09:39:34.238005333 +0000 UTC m=+0.131722134 container start ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:39:34 np0005604791 podman[79917]: 2026-02-02 09:39:34.241546314 +0000 UTC m=+0.135263085 container attach ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Feb  2 04:39:34 np0005604791 systemd[1]: libpod-ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a.scope: Deactivated successfully.
Feb  2 04:39:34 np0005604791 conmon[79932]: conmon ddc2e2ce4b98f06f54b4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a.scope/container/memory.events
Feb  2 04:39:34 np0005604791 podman[79917]: 2026-02-02 09:39:34.33999999 +0000 UTC m=+0.233716761 container died ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Feb  2 04:39:34 np0005604791 systemd[1]: var-lib-containers-storage-overlay-d419a8073852563555ed78329b04ef7e1da550e13c8f5c0b2f85e031f2767cef-merged.mount: Deactivated successfully.
Feb  2 04:39:34 np0005604791 podman[79917]: 2026-02-02 09:39:34.370359242 +0000 UTC m=+0.264076063 container remove ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:39:34 np0005604791 systemd[1]: libpod-conmon-ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a.scope: Deactivated successfully.
Feb  2 04:39:34 np0005604791 systemd[1]: Reloading.
Feb  2 04:39:34 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:39:34 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:39:34 np0005604791 systemd[1]: Reloading.
Feb  2 04:39:34 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:39:34 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:39:34 np0005604791 systemd[1]: Starting Ceph mon.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:39:35 np0005604791 podman[80095]: 2026-02-02 09:39:35.085332847 +0000 UTC m=+0.053927470 container create 4fad2af3fdacb89ebd4fdf531ec7dcca4c4e2060f02fe02ce9a8fbecdbfd8229 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mon-compute-1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:39:35 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1276f2d60fbd2a1ce412bd995784b0a7c65ba46366e9b517ec41454855023823/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:35 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1276f2d60fbd2a1ce412bd995784b0a7c65ba46366e9b517ec41454855023823/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:35 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1276f2d60fbd2a1ce412bd995784b0a7c65ba46366e9b517ec41454855023823/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:35 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1276f2d60fbd2a1ce412bd995784b0a7c65ba46366e9b517ec41454855023823/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:35 np0005604791 podman[80095]: 2026-02-02 09:39:35.148476144 +0000 UTC m=+0.117070737 container init 4fad2af3fdacb89ebd4fdf531ec7dcca4c4e2060f02fe02ce9a8fbecdbfd8229 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mon-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:39:35 np0005604791 podman[80095]: 2026-02-02 09:39:35.060795115 +0000 UTC m=+0.029389788 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:35 np0005604791 podman[80095]: 2026-02-02 09:39:35.155540156 +0000 UTC m=+0.124134749 container start 4fad2af3fdacb89ebd4fdf531ec7dcca4c4e2060f02fe02ce9a8fbecdbfd8229 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mon-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Feb  2 04:39:35 np0005604791 bash[80095]: 4fad2af3fdacb89ebd4fdf531ec7dcca4c4e2060f02fe02ce9a8fbecdbfd8229
Feb  2 04:39:35 np0005604791 systemd[1]: Started Ceph mon.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: set uid:gid to 167:167 (ceph:ceph)
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: pidfile_write: ignore empty --pid-file
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: load: jerasure load: lrc 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: RocksDB version: 7.9.2
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Git sha 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Compile date 2025-07-17 03:12:14
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: DB SUMMARY
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: DB Session ID:  DE871D21TSCUFP8UED8E
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: CURRENT file:  CURRENT
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: IDENTITY file:  IDENTITY
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                         Options.error_if_exists: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                       Options.create_if_missing: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                         Options.paranoid_checks: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                                     Options.env: 0x55a64c64ec20
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                                      Options.fs: PosixFileSystem
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                                Options.info_log: 0x55a64de99a20
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                Options.max_file_opening_threads: 16
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                              Options.statistics: (nil)
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                               Options.use_fsync: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                       Options.max_log_file_size: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                       Options.keep_log_file_num: 1000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                    Options.recycle_log_file_num: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                         Options.allow_fallocate: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                        Options.allow_mmap_reads: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                       Options.allow_mmap_writes: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                        Options.use_direct_reads: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:          Options.create_missing_column_families: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                              Options.db_log_dir: 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                                 Options.wal_dir: 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                Options.table_cache_numshardbits: 6
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                   Options.advise_random_on_open: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                    Options.db_write_buffer_size: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                    Options.write_buffer_manager: 0x55a64de9d900
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                            Options.rate_limiter: (nil)
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                       Options.wal_recovery_mode: 2
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                  Options.enable_thread_tracking: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                  Options.enable_pipelined_write: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                  Options.unordered_write: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                               Options.row_cache: None
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                              Options.wal_filter: None
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.allow_ingest_behind: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.two_write_queues: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.manual_wal_flush: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.wal_compression: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.atomic_flush: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                 Options.log_readahead_size: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                 Options.best_efforts_recovery: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.allow_data_in_errors: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.db_host_id: __hostname__
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.enforce_single_del_contracts: true
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.max_background_jobs: 2
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.max_background_compactions: -1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.max_subcompactions: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.delayed_write_rate : 16777216
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.max_total_wal_size: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                          Options.max_open_files: -1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                          Options.bytes_per_sync: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:       Options.compaction_readahead_size: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                  Options.max_background_flushes: -1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Compression algorithms supported:
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: #011kZSTD supported: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: #011kXpressCompression supported: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: #011kBZip2Compression supported: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: #011kLZ4Compression supported: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: #011kZlibCompression supported: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: #011kLZ4HCCompression supported: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: #011kSnappyCompression supported: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Fast CRC32 supported: Supported on x86
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: DMutex implementation: pthread_mutex_t
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:           Options.merge_operator: 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:        Options.compaction_filter: None
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:        Options.compaction_filter_factory: None
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:  Options.sst_partitioner_factory: None
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:            Options.table_factory: BlockBasedTable
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a64de985c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a64debd350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:        Options.write_buffer_size: 33554432
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:  Options.max_write_buffer_number: 2
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:          Options.compression: NoCompression
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                  Options.bottommost_compression: Disabled
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:       Options.prefix_extractor: nullptr
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.num_levels: 7
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:            Options.compression_opts.window_bits: -14
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                  Options.compression_opts.level: 32767
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:               Options.compression_opts.strategy: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                  Options.compression_opts.enabled: false
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                   Options.target_file_size_base: 67108864
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:             Options.target_file_size_multiplier: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                        Options.arena_block_size: 1048576
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                Options.disable_auto_compactions: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                   Options.inplace_update_support: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:   Options.memtable_huge_page_size: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                           Options.bloom_locality: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                    Options.max_successive_merges: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                Options.paranoid_file_checks: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                Options.force_consistency_checks: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                Options.report_bg_io_stats: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                               Options.ttl: 2592000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                       Options.enable_blob_files: false
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                           Options.min_blob_size: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                          Options.blob_file_size: 268435456
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb:                Options.blob_file_starting_level: 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fac1d709-8a2a-487d-b05b-57255ec289c7
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025175200272, "job": 1, "event": "recovery_started", "wal_files": [4]}
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025175202204, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025175202286, "job": 1, "event": "recovery_finished"}
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55a64debee00
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: DB pointer 0x55a64dfc8000
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a64debd350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(???) e0 preinit fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).mds e1 new map
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).mds e1 print_map#012e1#012btime 2026-02-02T09:37:43:907997+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: -1#012 #012No filesystems configured
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 3314933000852226048, adjusting msgr requires
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Health check failed: Failed to apply 2 service(s): mon,mgr (CEPHADM_APPLY_SPEC_FAIL)
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.101:0/908444544' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "273baa6d-671d-41d3-8896-5eac2274aa10"}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.101:0/908444544' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "273baa6d-671d-41d3-8896-5eac2274aa10"}]': finished
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/224206128' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "fabfc705-a3af-416c-81a4-3fd4d777fb5f"}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/224206128' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "fabfc705-a3af-416c-81a4-3fd4d777fb5f"}]': finished
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Deploying daemon osd.0 on compute-1
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Deploying daemon osd.1 on compute-0
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='osd.0 [v2:192.168.122.101:6800/3783871040,v1:192.168.122.101:6801/3783871040]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='osd.0 [v2:192.168.122.101:6800/3783871040,v1:192.168.122.101:6801/3783871040]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='osd.0 [v2:192.168.122.101:6800/3783871040,v1:192.168.122.101:6801/3783871040]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='osd.0 [v2:192.168.122.101:6800/3783871040,v1:192.168.122.101:6801/3783871040]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]': finished
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='osd.1 [v2:192.168.122.100:6802/3795740271,v1:192.168.122.100:6803/3795740271]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='osd.1 [v2:192.168.122.100:6802/3795740271,v1:192.168.122.100:6803/3795740271]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='osd.1 [v2:192.168.122.100:6802/3795740271,v1:192.168.122.100:6803/3795740271]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='osd.1 [v2:192.168.122.100:6802/3795740271,v1:192.168.122.100:6803/3795740271]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Adjusting osd_memory_target on compute-1 to  5247M
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Adjusting osd_memory_target on compute-0 to 127.9M
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Unable to set osd_memory_target on compute-0 to 134197657: error parsing value: Value '134197657' is below minimum 939524096
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: OSD bench result of 4998.717013 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: osd.0 [v2:192.168.122.101:6800/3783871040,v1:192.168.122.101:6801/3783871040] boot
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: OSD bench result of 7696.745182 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: osd.1 [v2:192.168.122.100:6802/3795740271,v1:192.168.122.100:6803/3795740271] boot
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.conf
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Deploying daemon mon.compute-2 on compute-2
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: Cluster is now healthy
Feb  2 04:39:35 np0005604791 ceph-mon[80115]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Feb  2 04:39:41 np0005604791 ceph-mon[80115]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Feb  2 04:39:41 np0005604791 ceph-mon[80115]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Feb  2 04:39:41 np0005604791 ceph-mon[80115]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Feb  2 04:39:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: Deploying daemon mon.compute-1 on compute-1
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-0 calling monitor election
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-2 calling monitor election
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: overall HEALTH_OK
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026,kernel_version=5.14.0-665.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864292,os=Linux}
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-0 calling monitor election
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-2 calling monitor election
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-1 calling monitor election
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: overall HEALTH_OK
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:44 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Feb  2 04:39:44 np0005604791 podman[80246]: 2026-02-02 09:39:44.990868407 +0000 UTC m=+0.047301099 container create 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:39:45 np0005604791 systemd[1]: Started libpod-conmon-8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff.scope.
Feb  2 04:39:45 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:39:45 np0005604791 podman[80246]: 2026-02-02 09:39:44.968507232 +0000 UTC m=+0.024939914 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:45 np0005604791 podman[80246]: 2026-02-02 09:39:45.069187974 +0000 UTC m=+0.125620656 container init 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Feb  2 04:39:45 np0005604791 podman[80246]: 2026-02-02 09:39:45.081375518 +0000 UTC m=+0.137808210 container start 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Feb  2 04:39:45 np0005604791 podman[80246]: 2026-02-02 09:39:45.084815837 +0000 UTC m=+0.141248529 container attach 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Feb  2 04:39:45 np0005604791 peaceful_buck[80262]: 167 167
Feb  2 04:39:45 np0005604791 systemd[1]: libpod-8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff.scope: Deactivated successfully.
Feb  2 04:39:45 np0005604791 podman[80246]: 2026-02-02 09:39:45.088570664 +0000 UTC m=+0.145003326 container died 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Feb  2 04:39:45 np0005604791 systemd[1]: var-lib-containers-storage-overlay-3b4863e23ba2b1038f54ad1d06529049c7f62ba5d7b8112cb6f59a0ba41f85b9-merged.mount: Deactivated successfully.
Feb  2 04:39:45 np0005604791 podman[80246]: 2026-02-02 09:39:45.119441329 +0000 UTC m=+0.175874001 container remove 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Feb  2 04:39:45 np0005604791 systemd[1]: libpod-conmon-8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff.scope: Deactivated successfully.
Feb  2 04:39:45 np0005604791 systemd[1]: Reloading.
Feb  2 04:39:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e13 _set_new_cache_sizes cache_size:1019937191 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:39:45 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:39:45 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:39:45 np0005604791 systemd[1]: Reloading.
Feb  2 04:39:45 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:45 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.teascl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Feb  2 04:39:45 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.teascl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb  2 04:39:45 np0005604791 ceph-mon[80115]: Deploying daemon mgr.compute-1.teascl on compute-1
Feb  2 04:39:45 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:39:45 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:39:45 np0005604791 systemd[1]: Starting Ceph mgr.compute-1.teascl for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:39:45 np0005604791 podman[80403]: 2026-02-02 09:39:45.847404488 +0000 UTC m=+0.037840455 container create 0fc1762cd853cc2a965061c42d42995418c7f99fc735927111e33663986b9994 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Feb  2 04:39:45 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11afdbff549d454aa86b0a5e2e5a5b9932ccb9b413356ee9e1a859501bb40587/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:45 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11afdbff549d454aa86b0a5e2e5a5b9932ccb9b413356ee9e1a859501bb40587/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:45 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11afdbff549d454aa86b0a5e2e5a5b9932ccb9b413356ee9e1a859501bb40587/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:45 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11afdbff549d454aa86b0a5e2e5a5b9932ccb9b413356ee9e1a859501bb40587/merged/var/lib/ceph/mgr/ceph-compute-1.teascl supports timestamps until 2038 (0x7fffffff)
Feb  2 04:39:45 np0005604791 podman[80403]: 2026-02-02 09:39:45.908884952 +0000 UTC m=+0.099320919 container init 0fc1762cd853cc2a965061c42d42995418c7f99fc735927111e33663986b9994 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 04:39:45 np0005604791 podman[80403]: 2026-02-02 09:39:45.918389257 +0000 UTC m=+0.108825224 container start 0fc1762cd853cc2a965061c42d42995418c7f99fc735927111e33663986b9994 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Feb  2 04:39:45 np0005604791 bash[80403]: 0fc1762cd853cc2a965061c42d42995418c7f99fc735927111e33663986b9994
Feb  2 04:39:45 np0005604791 podman[80403]: 2026-02-02 09:39:45.830644697 +0000 UTC m=+0.021080634 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:39:45 np0005604791 systemd[1]: Started Ceph mgr.compute-1.teascl for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:39:45 np0005604791 ceph-mgr[80422]: set uid:gid to 167:167 (ceph:ceph)
Feb  2 04:39:45 np0005604791 ceph-mgr[80422]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Feb  2 04:39:45 np0005604791 ceph-mgr[80422]: pidfile_write: ignore empty --pid-file
Feb  2 04:39:45 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'alerts'
Feb  2 04:39:46 np0005604791 ceph-mgr[80422]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb  2 04:39:46 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'balancer'
Feb  2 04:39:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:46.074+0000 7f2ea4a6c140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb  2 04:39:46 np0005604791 ceph-mgr[80422]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb  2 04:39:46 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'cephadm'
Feb  2 04:39:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:46.146+0000 7f2ea4a6c140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb  2 04:39:46 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:46 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:46 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:46 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:46 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Feb  2 04:39:46 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Feb  2 04:39:46 np0005604791 ceph-mon[80115]: Deploying daemon crash.compute-2 on compute-2
Feb  2 04:39:46 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'crash'
Feb  2 04:39:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:46.801+0000 7f2ea4a6c140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb  2 04:39:46 np0005604791 ceph-mgr[80422]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb  2 04:39:46 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'dashboard'
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'devicehealth'
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'diskprediction_local'
Feb  2 04:39:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:47.339+0000 7f2ea4a6c140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb  2 04:39:47 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2428528003' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb  2 04:39:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e14 e14: 2 total, 2 up, 2 in
Feb  2 04:39:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb  2 04:39:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb  2 04:39:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]:  from numpy import show_config as show_numpy_config
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb  2 04:39:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:47.484+0000 7f2ea4a6c140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'influx'
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'insights'
Feb  2 04:39:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:47.545+0000 7f2ea4a6c140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'iostat'
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'k8sevents'
Feb  2 04:39:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:47.663+0000 7f2ea4a6c140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb  2 04:39:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'localpool'
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'mds_autoscaler'
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'mirroring'
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'nfs'
Feb  2 04:39:48 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2428528003' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb  2 04:39:48 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:48 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:48 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:48 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:48 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:39:48 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:39:48 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/4145763547' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb  2 04:39:48 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e15 e15: 2 total, 2 up, 2 in
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'orchestrator'
Feb  2 04:39:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:48.544+0000 7f2ea4a6c140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb  2 04:39:48 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 15 pg[3.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [0] r=0 lpr=15 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'osd_perf_query'
Feb  2 04:39:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:48.743+0000 7f2ea4a6c140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'osd_support'
Feb  2 04:39:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:48.813+0000 7f2ea4a6c140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'pg_autoscaler'
Feb  2 04:39:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:48.875+0000 7f2ea4a6c140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb  2 04:39:48 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'progress'
Feb  2 04:39:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:48.949+0000 7f2ea4a6c140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb  2 04:39:49 np0005604791 ceph-mgr[80422]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb  2 04:39:49 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'prometheus'
Feb  2 04:39:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:49.029+0000 7f2ea4a6c140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb  2 04:39:49 np0005604791 ceph-mgr[80422]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb  2 04:39:49 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rbd_support'
Feb  2 04:39:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:49.333+0000 7f2ea4a6c140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb  2 04:39:49 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e16 e16: 3 total, 2 up, 3 in
Feb  2 04:39:49 np0005604791 ceph-mgr[80422]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb  2 04:39:49 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'restful'
Feb  2 04:39:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:49.422+0000 7f2ea4a6c140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb  2 04:39:49 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 16 pg[4.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=16) [0] r=0 lpr=16 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:39:49 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 16 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [0] r=0 lpr=15 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:39:49 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/4145763547' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb  2 04:39:49 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2619661592' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb  2 04:39:49 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:49 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.102:0/1508149425' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d6a8c5e6-c7a4-4174-b954-0533ecfedcd2"}]: dispatch
Feb  2 04:39:49 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d6a8c5e6-c7a4-4174-b954-0533ecfedcd2"}]: dispatch
Feb  2 04:39:49 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2619661592' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb  2 04:39:49 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "d6a8c5e6-c7a4-4174-b954-0533ecfedcd2"}]': finished
Feb  2 04:39:49 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rgw'
Feb  2 04:39:49 np0005604791 ceph-mgr[80422]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb  2 04:39:49 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rook'
Feb  2 04:39:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:49.831+0000 7f2ea4a6c140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e16 _set_new_cache_sizes cache_size:1020053324 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'selftest'
Feb  2 04:39:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.336+0000 7f2ea4a6c140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'snap_schedule'
Feb  2 04:39:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.414+0000 7f2ea4a6c140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e17 e17: 3 total, 2 up, 3 in
Feb  2 04:39:50 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 17 pg[5.0( empty local-lis/les=0/0 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:39:50 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 17 pg[4.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=16) [0] r=0 lpr=16 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:39:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.494+0000 7f2ea4a6c140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'stats'
Feb  2 04:39:50 np0005604791 ceph-mon[80115]: Health check failed: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Feb  2 04:39:50 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1316336526' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb  2 04:39:50 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1316336526' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'status'
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'telegraf'
Feb  2 04:39:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.625+0000 7f2ea4a6c140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'telemetry'
Feb  2 04:39:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.688+0000 7f2ea4a6c140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb  2 04:39:50 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'test_orchestrator'
Feb  2 04:39:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.827+0000 7f2ea4a6c140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb  2 04:39:51 np0005604791 ceph-mgr[80422]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb  2 04:39:51 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'volumes'
Feb  2 04:39:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:51.020+0000 7f2ea4a6c140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb  2 04:39:51 np0005604791 ceph-mgr[80422]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb  2 04:39:51 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'zabbix'
Feb  2 04:39:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:51.257+0000 7f2ea4a6c140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb  2 04:39:51 np0005604791 ceph-mgr[80422]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb  2 04:39:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:51.318+0000 7f2ea4a6c140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb  2 04:39:51 np0005604791 ceph-mgr[80422]: ms_deliver_dispatch: unhandled message 0x55dc9e0b8d00 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Feb  2 04:39:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e18 e18: 3 total, 2 up, 3 in
Feb  2 04:39:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 18 pg[6.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:39:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 18 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:39:51 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2386791214' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb  2 04:39:51 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2386791214' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb  2 04:39:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e19 e19: 3 total, 2 up, 3 in
Feb  2 04:39:52 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 19 pg[6.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:39:52 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/4009666663' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb  2 04:39:52 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:52 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:39:52 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/4009666663' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb  2 04:39:53 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e20 e20: 3 total, 2 up, 3 in
Feb  2 04:39:53 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1859598156' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Feb  2 04:39:53 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1859598156' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Feb  2 04:39:54 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e21 e21: 3 total, 2 up, 3 in
Feb  2 04:39:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e21 _set_new_cache_sizes cache_size:1020054713 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:39:55 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/740467932' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Feb  2 04:39:55 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Feb  2 04:39:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e22 e22: 3 total, 2 up, 3 in
Feb  2 04:39:56 np0005604791 ceph-mon[80115]: Deploying daemon osd.2 on compute-2
Feb  2 04:39:56 np0005604791 ceph-mon[80115]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Feb  2 04:39:56 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/740467932' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Feb  2 04:39:56 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1601645049' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Feb  2 04:39:56 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e23 e23: 3 total, 2 up, 3 in
Feb  2 04:39:57 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1601645049' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Feb  2 04:39:58 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e24 e24: 3 total, 2 up, 3 in
Feb  2 04:39:58 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2116931678' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Feb  2 04:39:58 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2116931678' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Feb  2 04:40:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e24 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e25 e25: 3 total, 2 up, 3 in
Feb  2 04:40:01 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/4028203447' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Feb  2 04:40:01 np0005604791 ceph-mon[80115]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Feb  2 04:40:01 np0005604791 ceph-mon[80115]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Feb  2 04:40:01 np0005604791 ceph-mon[80115]:    application not enabled on pool 'images'
Feb  2 04:40:01 np0005604791 ceph-mon[80115]:    application not enabled on pool 'cephfs.cephfs.meta'
Feb  2 04:40:01 np0005604791 ceph-mon[80115]:    application not enabled on pool 'cephfs.cephfs.data'
Feb  2 04:40:01 np0005604791 ceph-mon[80115]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Feb  2 04:40:01 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/4028203447' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Feb  2 04:40:01 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:01 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:01 np0005604791 ceph-mon[80115]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Feb  2 04:40:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e26 e26: 3 total, 2 up, 3 in
Feb  2 04:40:02 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1055840720' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Feb  2 04:40:03 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1055840720' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Feb  2 04:40:03 np0005604791 ceph-mon[80115]: from='osd.2 [v2:192.168.122.102:6800/4043786308,v1:192.168.122.102:6801/4043786308]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Feb  2 04:40:03 np0005604791 ceph-mon[80115]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Feb  2 04:40:03 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:40:03 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e27 e27: 3 total, 2 up, 3 in
Feb  2 04:40:04 np0005604791 ceph-mon[80115]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Feb  2 04:40:04 np0005604791 ceph-mon[80115]: Cluster is now healthy
Feb  2 04:40:04 np0005604791 ceph-mon[80115]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Feb  2 04:40:04 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:40:04 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:40:04 np0005604791 ceph-mon[80115]: from='osd.2 [v2:192.168.122.102:6800/4043786308,v1:192.168.122.102:6801/4043786308]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Feb  2 04:40:04 np0005604791 ceph-mon[80115]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Feb  2 04:40:04 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:04 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:04 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e28 e28: 3 total, 2 up, 3 in
Feb  2 04:40:04 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 28 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=28 pruub=10.911078453s) [] r=-1 lpr=28 pi=[17,28)/1 crt=0'0 mlcod 0'0 active pruub 66.645973206s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:04 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 28 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=28 pruub=8.888126373s) [] r=-1 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active pruub 64.623085022s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:04 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 28 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=28 pruub=10.911078453s) [] r=-1 lpr=28 pi=[17,28)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.645973206s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:04 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 28 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=28 pruub=8.888126373s) [] r=-1 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.623085022s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:04 np0005604791 podman[80602]: 2026-02-02 09:40:04.767236964 +0000 UTC m=+0.077869896 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Feb  2 04:40:04 np0005604791 podman[80602]: 2026-02-02 09:40:04.896600326 +0000 UTC m=+0.207233278 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e28 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e29 e29: 3 total, 2 up, 3 in
Feb  2 04:40:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 29 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=29 pruub=7.829770565s) [] r=-1 lpr=29 pi=[15,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.623085022s@ mbc={}] PeeringState::start_peering_interval up [] -> [], acting [] -> [], acting_primary ? -> -1, up_primary ? -> -1, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 29 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=29 pruub=7.829770565s) [] r=-1 lpr=29 pi=[15,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.623085022s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:06 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:40:06 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:40:06 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:40:06 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:40:06 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2425208278' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Feb  2 04:40:06 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2425208278' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Feb  2 04:40:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e30 e30: 3 total, 2 up, 3 in
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1005529416' entity='client.admin' 
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: OSD bench result of 6231.058141 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: Adjusting osd_memory_target on compute-2 to 127.9M
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: Unable to set osd_memory_target on compute-2 to 134203392: error parsing value: Value '134203392' is below minimum 939524096
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.conf
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.conf
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.conf
Feb  2 04:40:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e31 e31: 3 total, 3 up, 3 in
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.19( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.19( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.17( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.17( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.16( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.18( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.16( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.18( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.15( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.14( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.13( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.13( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.15( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.14( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.12( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.12( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.11( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.11( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.10( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.f( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.10( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.e( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.f( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.d( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.e( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.c( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.c( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.b( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.a( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.d( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.a( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.b( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=31 pruub=5.751711845s) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.623085022s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.7( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=31 pruub=5.751691341s) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.623085022s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=31 pruub=7.774549484s) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.645973206s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.7( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[4.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=31 pruub=14.765381813s) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active pruub 73.636924744s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.6( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.6( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.5( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.2( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.5( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.3( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.3( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.2( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.4( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.8( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.8( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.4( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.9( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1a( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.9( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1b( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1a( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1b( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1c( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1d( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1c( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1e( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1d( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1e( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1f( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1f( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=31 pruub=7.771995544s) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.645973206s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[4.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=31 pruub=14.765381813s) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown pruub 73.636924744s@ mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e32 e32: 3 total, 3 up, 3 in
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1e( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1f( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1e( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.10( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1f( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.11( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.13( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.11( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.10( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.12( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.15( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.12( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.13( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.14( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.14( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.15( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.17( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.16( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.16( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.17( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.8( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.9( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.8( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.9( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.b( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.a( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.a( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.b( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.d( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.c( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.d( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.6( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.7( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.c( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.3( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.2( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.6( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.4( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.5( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.7( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.5( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.4( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.2( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.e( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.f( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.3( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.f( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1c( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.e( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1d( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1d( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1c( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1a( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1b( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1a( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1b( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.18( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.19( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.19( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.18( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1e( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.10( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.11( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.16( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.12( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.17( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.b( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.0( empty local-lis/les=31/32 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.7( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.4( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.f( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: osd.2 [v2:192.168.122.102:6800/4043786308,v1:192.168.122.102:6801/4043786308] boot
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: Saving service ingress.rgw.default spec with placement count:2
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:08 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:40:09 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Feb  2 04:40:09 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Feb  2 04:40:10 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Feb  2 04:40:10 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Feb  2 04:40:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e33 e33: 3 total, 3 up, 3 in
Feb  2 04:40:10 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:40:10 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:10 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: Saving service node-exporter spec with placement *
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: Saving service grafana spec with placement compute-0;count:1
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: Saving service prometheus spec with placement compute-0;count:1
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: Saving service alertmanager spec with placement compute-0;count:1
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e34 e34: 3 total, 3 up, 3 in
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 33 pg[6.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=33 pruub=13.375972748s) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active pruub 75.653724670s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=33 pruub=13.375972748s) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown pruub 75.653724670s@ mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.19( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.e( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.17( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.12( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1b( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1c( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1f( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.c( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.9( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.8( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.b( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.6( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.f( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.18( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.a( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1d( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.2( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.d( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1a( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.4( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.7( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.11( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.3( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1e( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.16( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.13( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.14( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.5( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.15( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.10( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Feb  2 04:40:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e35 e35: 3 total, 3 up, 3 in
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1b( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2304504428' entity='client.admin' 
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.18( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.c( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.19( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1f( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.6( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.7( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.4( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.d( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.0( empty local-lis/les=33/35 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.5( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.f( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.3( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.2( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.b( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.15( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.8( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.16( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.17( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.11( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.14( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.10( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.13( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.12( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1d( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1c( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.9( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:13 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Feb  2 04:40:13 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Feb  2 04:40:13 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/3997762270' entity='client.admin' 
Feb  2 04:40:13 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:13 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:13 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.zjyufj", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb  2 04:40:13 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.zjyufj", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb  2 04:40:13 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:13 np0005604791 ceph-mon[80115]: Deploying daemon rgw.rgw.compute-2.zjyufj on compute-2
Feb  2 04:40:14 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Feb  2 04:40:14 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Feb  2 04:40:14 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/3629224831' entity='client.admin' 
Feb  2 04:40:15 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Feb  2 04:40:15 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Feb  2 04:40:15 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:15 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:15 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:15 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.ezjvcf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb  2 04:40:15 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.ezjvcf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb  2 04:40:15 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:15 np0005604791 ceph-mon[80115]: Deploying daemon rgw.rgw.compute-1.ezjvcf on compute-1
Feb  2 04:40:15 np0005604791 podman[81309]: 2026-02-02 09:40:15.222783942 +0000 UTC m=+0.038046251 container create 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Feb  2 04:40:15 np0005604791 systemd[1]: Started libpod-conmon-6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca.scope.
Feb  2 04:40:15 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:40:15 np0005604791 podman[81309]: 2026-02-02 09:40:15.295970467 +0000 UTC m=+0.111232846 container init 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:40:15 np0005604791 podman[81309]: 2026-02-02 09:40:15.203324361 +0000 UTC m=+0.018586690 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:40:15 np0005604791 podman[81309]: 2026-02-02 09:40:15.303783648 +0000 UTC m=+0.119045977 container start 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:40:15 np0005604791 nostalgic_driscoll[81325]: 167 167
Feb  2 04:40:15 np0005604791 systemd[1]: libpod-6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca.scope: Deactivated successfully.
Feb  2 04:40:15 np0005604791 conmon[81325]: conmon 6a4f746bf2197116f1a1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca.scope/container/memory.events
Feb  2 04:40:15 np0005604791 podman[81309]: 2026-02-02 09:40:15.313337744 +0000 UTC m=+0.128600083 container attach 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True)
Feb  2 04:40:15 np0005604791 podman[81309]: 2026-02-02 09:40:15.313925379 +0000 UTC m=+0.129187718 container died 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Feb  2 04:40:15 np0005604791 systemd[1]: var-lib-containers-storage-overlay-0937e87ad384aee4e628dbd5763f25cc2a03f2655138eca49aad3566aef90868-merged.mount: Deactivated successfully.
Feb  2 04:40:15 np0005604791 podman[81309]: 2026-02-02 09:40:15.36092028 +0000 UTC m=+0.176182619 container remove 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 04:40:15 np0005604791 systemd[1]: libpod-conmon-6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca.scope: Deactivated successfully.
Feb  2 04:40:15 np0005604791 systemd[1]: Reloading.
Feb  2 04:40:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:15 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:40:15 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:40:15 np0005604791 systemd[1]: Reloading.
Feb  2 04:40:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e36 e36: 3 total, 3 up, 3 in
Feb  2 04:40:15 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 36 pg[8.0( empty local-lis/les=0/0 n=0 ec=36/36 lis/c=0/0 les/c/f=0/0/0 sis=36) [0] r=0 lpr=36 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:15 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:40:15 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:40:15 np0005604791 python3[81404]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:40:15 np0005604791 systemd[1]: Starting Ceph rgw.rgw.compute-1.ezjvcf for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:40:15 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Feb  2 04:40:15 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Feb  2 04:40:16 np0005604791 podman[81508]: 2026-02-02 09:40:16.200448303 +0000 UTC m=+0.052555215 container create e606a473626c5a5a09847083ea79e9733c181fd2942d240723a9ade446a340dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-rgw-rgw-compute-1-ezjvcf, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:40:16 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9e030790992795d171cc14f037f50136b47708a0fde4cb6d08d5e5623e1a535/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:40:16 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9e030790992795d171cc14f037f50136b47708a0fde4cb6d08d5e5623e1a535/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:40:16 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9e030790992795d171cc14f037f50136b47708a0fde4cb6d08d5e5623e1a535/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:40:16 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9e030790992795d171cc14f037f50136b47708a0fde4cb6d08d5e5623e1a535/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.ezjvcf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:40:16 np0005604791 podman[81508]: 2026-02-02 09:40:16.179870453 +0000 UTC m=+0.031977355 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:40:16 np0005604791 podman[81508]: 2026-02-02 09:40:16.280105495 +0000 UTC m=+0.132212447 container init e606a473626c5a5a09847083ea79e9733c181fd2942d240723a9ade446a340dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-rgw-rgw-compute-1-ezjvcf, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:40:16 np0005604791 podman[81508]: 2026-02-02 09:40:16.286855759 +0000 UTC m=+0.138962671 container start e606a473626c5a5a09847083ea79e9733c181fd2942d240723a9ade446a340dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-rgw-rgw-compute-1-ezjvcf, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:40:16 np0005604791 bash[81508]: e606a473626c5a5a09847083ea79e9733c181fd2942d240723a9ade446a340dd
Feb  2 04:40:16 np0005604791 systemd[1]: Started Ceph rgw.rgw.compute-1.ezjvcf for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:40:16 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/3477511090' entity='client.admin' 
Feb  2 04:40:16 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.102:0/343742408' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Feb  2 04:40:16 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Feb  2 04:40:16 np0005604791 radosgw[81528]: deferred set uid:gid to 167:167 (ceph:ceph)
Feb  2 04:40:16 np0005604791 radosgw[81528]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Feb  2 04:40:16 np0005604791 radosgw[81528]: framework: beast
Feb  2 04:40:16 np0005604791 radosgw[81528]: framework conf key: endpoint, val: 192.168.122.101:8082
Feb  2 04:40:16 np0005604791 radosgw[81528]: init_numa not setting numa affinity
Feb  2 04:40:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e37 e37: 3 total, 3 up, 3 in
Feb  2 04:40:16 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 37 pg[8.0( empty local-lis/les=36/37 n=0 ec=36/36 lis/c=0/0 les/c/f=0/0/0 sis=36) [0] r=0 lpr=36 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:17 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Feb  2 04:40:17 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vltabo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vltabo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: Deploying daemon rgw.rgw.compute-0.vltabo on compute-0
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1069846288' entity='client.admin' 
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e38 e38: 3 total, 3 up, 3 in
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Feb  2 04:40:17 np0005604791 ceph-mon[80115]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Feb  2 04:40:18 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.17 deep-scrub starts
Feb  2 04:40:18 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 38 pg[9.0( empty local-lis/les=0/0 n=0 ec=38/38 lis/c=0/0 les/c/f=0/0/0 sis=38) [0] r=0 lpr=38 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:18 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.17 deep-scrub ok
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.102:0/1995934692' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:18 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e39 e39: 3 total, 3 up, 3 in
Feb  2 04:40:19 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Feb  2 04:40:19 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 39 pg[9.0( empty local-lis/les=38/39 n=0 ec=38/38 lis/c=0/0 les/c/f=0/0/0 sis=38) [0] r=0 lpr=38 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:19 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Feb  2 04:40:19 np0005604791 ceph-mon[80115]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Feb  2 04:40:19 np0005604791 ceph-mon[80115]: Deploying daemon haproxy.rgw.default.compute-0.avekxu on compute-0
Feb  2 04:40:19 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/3639574610' entity='client.admin' 
Feb  2 04:40:19 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Feb  2 04:40:19 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Feb  2 04:40:19 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Feb  2 04:40:19 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Feb  2 04:40:19 np0005604791 ceph-mon[80115]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb  2 04:40:19 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.b scrub starts
Feb  2 04:40:19 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.b scrub ok
Feb  2 04:40:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:20 np0005604791 ceph-mon[80115]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Feb  2 04:40:20 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2604604119' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Feb  2 04:40:20 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb  2 04:40:20 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.102:0/1995934692' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb  2 04:40:20 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb  2 04:40:20 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb  2 04:40:20 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb  2 04:40:20 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Feb  2 04:40:20 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Feb  2 04:40:20 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Feb  2 04:40:21 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2604604119' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Feb  2 04:40:21 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Feb  2 04:40:21 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Feb  2 04:40:21 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Feb  2 04:40:21 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Feb  2 04:40:21 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Feb  2 04:40:22 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 42 pg[11.0( empty local-lis/les=0/0 n=0 ec=42/42 lis/c=0/0 les/c/f=0/0/0 sis=42) [0] r=0 lpr=42 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.102:0/1995934692' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1965440456' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:22 np0005604791 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:22 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.15 deep-scrub starts
Feb  2 04:40:22 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.15 deep-scrub ok
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb  2 04:40:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 43 pg[11.0( empty local-lis/les=42/43 n=0 ec=42/42 lis/c=0/0 les/c/f=0/0/0 sis=42) [0] r=0 lpr=42 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr handle_mgr_map respawning because set of enabled modules changed!
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  e: '/usr/bin/ceph-mgr'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  0: '/usr/bin/ceph-mgr'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  1: '-n'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  2: 'mgr.compute-1.teascl'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  3: '-f'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  4: '--setuser'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  5: 'ceph'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  6: '--setgroup'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  7: 'ceph'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  8: '--default-log-to-file=false'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  9: '--default-log-to-journald=true'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr respawn  10: '--default-log-to-stderr=false'
Feb  2 04:40:23 np0005604791 systemd[1]: session-28.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd[1]: session-31.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd[1]: session-30.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd[1]: session-32.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd[1]: session-32.scope: Consumed 55.279s CPU time.
Feb  2 04:40:23 np0005604791 systemd[1]: session-20.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd[1]: session-27.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 28 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd[1]: session-23.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 31 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 20 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 32 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 23 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 27 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 30 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd[1]: session-25.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd[1]: session-29.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd[1]: session-24.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 25 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd[1]: session-26.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 29 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd[1]: session-22.scope: Deactivated successfully.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 26 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 24 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Session 22 logged out. Waiting for processes to exit.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 28.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 31.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 30.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 32.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 20.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 27.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 23.
Feb  2 04:40:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setuser ceph since I am not root
Feb  2 04:40:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setgroup ceph since I am not root
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 25.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 29.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 24.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 26.
Feb  2 04:40:23 np0005604791 systemd-logind[805]: Removed session 22.
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: pidfile_write: ignore empty --pid-file
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'alerts'
Feb  2 04:40:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:23.293+0000 7f6a2fa24140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'balancer'
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb  2 04:40:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:23.366+0000 7f6a2fa24140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb  2 04:40:23 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'cephadm'
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.102:0/1995934692' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb  2 04:40:23 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1965440456' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Feb  2 04:40:23 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Feb  2 04:40:23 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'crash'
Feb  2 04:40:24 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb  2 04:40:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:24.091+0000 7f6a2fa24140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'dashboard'
Feb  2 04:40:24 np0005604791 radosgw[81528]: v1 topic migration: starting v1 topic migration..
Feb  2 04:40:24 np0005604791 radosgw[81528]: LDAP not started since no server URIs were provided in the configuration.
Feb  2 04:40:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-rgw-rgw-compute-1-ezjvcf[81524]: 2026-02-02T09:40:24.539+0000 7fc5d4838980 -1 LDAP not started since no server URIs were provided in the configuration.
Feb  2 04:40:24 np0005604791 radosgw[81528]: v1 topic migration: finished v1 topic migration
Feb  2 04:40:24 np0005604791 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Feb  2 04:40:24 np0005604791 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Feb  2 04:40:24 np0005604791 radosgw[81528]: framework: beast
Feb  2 04:40:24 np0005604791 radosgw[81528]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Feb  2 04:40:24 np0005604791 radosgw[81528]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Feb  2 04:40:24 np0005604791 radosgw[81528]: starting handler: beast
Feb  2 04:40:24 np0005604791 radosgw[81528]: set uid:gid to 167:167 (ceph:ceph)
Feb  2 04:40:24 np0005604791 radosgw[81528]: mgrc service_daemon_register rgw.24170 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.ezjvcf,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026,kernel_version=5.14.0-665.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864292,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=d5604b0e-c827-4596-94de-7709c44354e7,zone_name=default,zonegroup_id=d74d963d-58da-4c60-ad13-18a6b0033c09,zonegroup_name=default}
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'devicehealth'
Feb  2 04:40:24 np0005604791 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Feb  2 04:40:24 np0005604791 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Feb  2 04:40:24 np0005604791 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'diskprediction_local'
Feb  2 04:40:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:24.676+0000 7f6a2fa24140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb  2 04:40:24 np0005604791 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Feb  2 04:40:24 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Feb  2 04:40:24 np0005604791 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Feb  2 04:40:24 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Feb  2 04:40:24 np0005604791 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Feb  2 04:40:24 np0005604791 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Feb  2 04:40:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb  2 04:40:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb  2 04:40:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]:  from numpy import show_config as show_numpy_config
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'influx'
Feb  2 04:40:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:24.820+0000 7f6a2fa24140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb  2 04:40:24 np0005604791 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Feb  2 04:40:24 np0005604791 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Feb  2 04:40:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:24.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'insights'
Feb  2 04:40:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:24.884+0000 7f6a2fa24140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb  2 04:40:24 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'iostat'
Feb  2 04:40:24 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.a deep-scrub starts
Feb  2 04:40:24 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.a deep-scrub ok
Feb  2 04:40:25 np0005604791 ceph-mgr[80422]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb  2 04:40:25 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'k8sevents'
Feb  2 04:40:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:25.006+0000 7f6a2fa24140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb  2 04:40:25 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'localpool'
Feb  2 04:40:25 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'mds_autoscaler'
Feb  2 04:40:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:25 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'mirroring'
Feb  2 04:40:25 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'nfs'
Feb  2 04:40:25 np0005604791 ceph-mgr[80422]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb  2 04:40:25 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'orchestrator'
Feb  2 04:40:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:25.875+0000 7f6a2fa24140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb  2 04:40:25 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.d scrub starts
Feb  2 04:40:25 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.d scrub ok
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'osd_perf_query'
Feb  2 04:40:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.078+0000 7f6a2fa24140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'osd_support'
Feb  2 04:40:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.152+0000 7f6a2fa24140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.216+0000 7f6a2fa24140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'pg_autoscaler'
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.286+0000 7f6a2fa24140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'progress'
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.350+0000 7f6a2fa24140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'prometheus'
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.670+0000 7f6a2fa24140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rbd_support'
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'restful'
Feb  2 04:40:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.758+0000 7f6a2fa24140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb  2 04:40:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:26.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:26.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:26 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Feb  2 04:40:26 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Feb  2 04:40:26 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rgw'
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.137+0000 7f6a2fa24140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rook'
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'selftest'
Feb  2 04:40:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.607+0000 7f6a2fa24140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'snap_schedule'
Feb  2 04:40:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.669+0000 7f6a2fa24140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'stats'
Feb  2 04:40:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.738+0000 7f6a2fa24140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'status'
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'telegraf'
Feb  2 04:40:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.864+0000 7f6a2fa24140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Feb  2 04:40:27 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.925+0000 7f6a2fa24140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb  2 04:40:27 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'telemetry'
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'test_orchestrator'
Feb  2 04:40:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:28.056+0000 7f6a2fa24140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'volumes'
Feb  2 04:40:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:28.250+0000 7f6a2fa24140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb  2 04:40:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:28.494+0000 7f6a2fa24140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'zabbix'
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb  2 04:40:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:28.557+0000 7f6a2fa24140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: mgr load Constructed class from module: dashboard
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: ms_deliver_dispatch: unhandled message 0x55d9b1b59860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: [dashboard INFO root] Configured CherryPy, starting engine...
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: [dashboard INFO root] Starting engine...
Feb  2 04:40:28 np0005604791 ceph-mgr[80422]: [dashboard INFO root] Engine started...
Feb  2 04:40:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:28.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:28.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:28 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.f scrub starts
Feb  2 04:40:28 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.f scrub ok
Feb  2 04:40:28 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Feb  2 04:40:29 np0005604791 ceph-mon[80115]: Active manager daemon compute-0.djvyfo restarted
Feb  2 04:40:29 np0005604791 ceph-mon[80115]: Activating manager daemon compute-0.djvyfo
Feb  2 04:40:29 np0005604791 systemd-logind[805]: New session 33 of user ceph-admin.
Feb  2 04:40:29 np0005604791 systemd[1]: Started Session 33 of User ceph-admin.
Feb  2 04:40:29 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Feb  2 04:40:29 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Feb  2 04:40:30 np0005604791 ceph-mon[80115]: Manager daemon compute-0.djvyfo is now available
Feb  2 04:40:30 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/mirror_snapshot_schedule"}]: dispatch
Feb  2 04:40:30 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/trash_purge_schedule"}]: dispatch
Feb  2 04:40:30 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:30 np0005604791 podman[82319]: 2026-02-02 09:40:30.124516565 +0000 UTC m=+0.071227215 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb  2 04:40:30 np0005604791 podman[82319]: 2026-02-02 09:40:30.230742071 +0000 UTC m=+0.177452741 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:40:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:30.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:30 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Feb  2 04:40:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:30 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Feb  2 04:40:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:30.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:40:30] ENGINE Bus STARTING
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: Cluster is now healthy
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:40:31 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Feb  2 04:40:31 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Feb  2 04:40:31 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.19( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.10( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.17( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.b( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.e( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.a( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.b( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.c( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.6( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.1( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.6( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.4( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.2( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.4( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.9( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.1e( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.1d( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.19( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.1e( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1b( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.118619919s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.339263916s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.752305984s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972938538s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1b( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.118590355s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.339263916s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.752251625s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972938538s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.752285004s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.973014832s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.752253532s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.973014832s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.19( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128526688s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349411011s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.752019882s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972885132s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751934052s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972869873s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.19( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128501892s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349411011s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751987457s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972885132s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751916885s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972869873s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128319740s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349395752s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128303528s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349395752s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751770973s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972930908s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751785278s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972946167s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751747131s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972930908s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751760483s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972946167s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128523827s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349822998s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128508568s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349822998s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751496315s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972885132s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751472473s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972885132s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128314972s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349754333s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128300667s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349754333s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751129150s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972656250s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.7( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127918243s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349494934s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751101494s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972656250s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.7( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127898216s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349494934s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751144409s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972885132s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751119614s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972885132s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750867844s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972656250s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128810883s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349357605s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750843048s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972656250s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127510071s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349357605s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750620842s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972595215s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.3( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127817154s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349807739s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750597954s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972595215s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.3( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127795219s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349807739s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127767563s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349815369s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127752304s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349815369s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750327110s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972518921s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.5( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127785683s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349975586s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750284195s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972518921s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750309944s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972595215s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.5( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127717972s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349975586s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750294685s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972595215s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127771378s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350097656s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127698898s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350097656s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750201225s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972648621s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750186920s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972648621s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127700806s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350204468s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749849319s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972381592s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127601624s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350158691s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127652168s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350204468s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749823570s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972381592s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127588272s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350158691s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749593735s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972244263s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749562263s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972244263s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.15( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127418518s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350196838s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127482414s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350265503s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127470016s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350265503s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749266624s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972091675s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.15( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127391815s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350196838s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749685287s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972518921s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749248505s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972091675s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749658585s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972518921s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749163628s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972068787s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749152184s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972068787s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.12( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127462387s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350486755s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.748907089s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.971946716s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.748896599s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.971946716s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127389908s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350494385s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.12( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127447128s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350486755s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127364159s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350494385s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.748805046s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.973007202s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.748677254s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.973007202s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:40:30] ENGINE Serving on https://192.168.122.100:7150
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:40:30] ENGINE Client ('192.168.122.100', 53184) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:40:30] ENGINE Serving on http://192.168.122.100:8765
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:40:30] ENGINE Bus STARTED
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:40:32 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:40:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:32.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:32.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.18 scrub starts
Feb  2 04:40:32 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.18 scrub ok
Feb  2 04:40:33 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.1e( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.1e( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.4( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.2( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.6( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.1( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.a( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.e( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.b( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.17( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.b( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.10( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.19( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:40:33 np0005604791 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.conf
Feb  2 04:40:33 np0005604791 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.conf
Feb  2 04:40:33 np0005604791 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.conf
Feb  2 04:40:33 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.1f scrub starts
Feb  2 04:40:33 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.1f scrub ok
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:34 np0005604791 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:34.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:34.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:34 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.c scrub starts
Feb  2 04:40:34 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.c scrub ok
Feb  2 04:40:35 np0005604791 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb  2 04:40:35 np0005604791 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb  2 04:40:35 np0005604791 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb  2 04:40:35 np0005604791 ceph-mon[80115]: Deploying daemon node-exporter.compute-0 on compute-0
Feb  2 04:40:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:35 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Feb  2 04:40:35 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Feb  2 04:40:36 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/3702593450' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Feb  2 04:40:36 np0005604791 ceph-mgr[80422]: mgr handle_mgr_map respawning because set of enabled modules changed!
Feb  2 04:40:36 np0005604791 systemd[1]: session-33.scope: Deactivated successfully.
Feb  2 04:40:36 np0005604791 systemd[1]: session-33.scope: Consumed 4.051s CPU time.
Feb  2 04:40:36 np0005604791 systemd-logind[805]: Session 33 logged out. Waiting for processes to exit.
Feb  2 04:40:36 np0005604791 systemd-logind[805]: Removed session 33.
Feb  2 04:40:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setuser ceph since I am not root
Feb  2 04:40:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setgroup ceph since I am not root
Feb  2 04:40:36 np0005604791 ceph-mgr[80422]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Feb  2 04:40:36 np0005604791 ceph-mgr[80422]: pidfile_write: ignore empty --pid-file
Feb  2 04:40:36 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'alerts'
Feb  2 04:40:36 np0005604791 ceph-mgr[80422]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb  2 04:40:36 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'balancer'
Feb  2 04:40:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:36.449+0000 7f820d21d140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb  2 04:40:36 np0005604791 ceph-mgr[80422]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb  2 04:40:36 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'cephadm'
Feb  2 04:40:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:36.531+0000 7f820d21d140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb  2 04:40:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:36.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:36.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:36 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Feb  2 04:40:36 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Feb  2 04:40:37 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/3702593450' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Feb  2 04:40:37 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'crash'
Feb  2 04:40:37 np0005604791 ceph-mgr[80422]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb  2 04:40:37 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'dashboard'
Feb  2 04:40:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:37.347+0000 7f820d21d140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb  2 04:40:37 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'devicehealth'
Feb  2 04:40:37 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.0 deep-scrub starts
Feb  2 04:40:37 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.0 deep-scrub ok
Feb  2 04:40:37 np0005604791 ceph-mgr[80422]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb  2 04:40:37 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'diskprediction_local'
Feb  2 04:40:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:37.979+0000 7f820d21d140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb  2 04:40:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb  2 04:40:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb  2 04:40:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]:  from numpy import show_config as show_numpy_config
Feb  2 04:40:38 np0005604791 ceph-mgr[80422]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb  2 04:40:38 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'influx'
Feb  2 04:40:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:38.138+0000 7f820d21d140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb  2 04:40:38 np0005604791 ceph-mgr[80422]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb  2 04:40:38 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'insights'
Feb  2 04:40:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:38.211+0000 7f820d21d140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb  2 04:40:38 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2174886532' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Feb  2 04:40:38 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'iostat'
Feb  2 04:40:38 np0005604791 ceph-mgr[80422]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb  2 04:40:38 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'k8sevents'
Feb  2 04:40:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:38.370+0000 7f820d21d140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb  2 04:40:38 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'localpool'
Feb  2 04:40:38 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'mds_autoscaler'
Feb  2 04:40:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:38.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:38.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:38 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.f scrub starts
Feb  2 04:40:38 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'mirroring'
Feb  2 04:40:38 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.f scrub ok
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'nfs'
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'orchestrator'
Feb  2 04:40:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.241+0000 7f820d21d140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2174886532' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'osd_perf_query'
Feb  2 04:40:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.435+0000 7f820d21d140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'osd_support'
Feb  2 04:40:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.502+0000 7f820d21d140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'pg_autoscaler'
Feb  2 04:40:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.560+0000 7f820d21d140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'progress'
Feb  2 04:40:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.629+0000 7f820d21d140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'prometheus'
Feb  2 04:40:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.693+0000 7f820d21d140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Feb  2 04:40:39 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb  2 04:40:39 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rbd_support'
Feb  2 04:40:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.991+0000 7f820d21d140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb  2 04:40:40 np0005604791 ceph-mgr[80422]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb  2 04:40:40 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'restful'
Feb  2 04:40:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:40.077+0000 7f820d21d140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb  2 04:40:40 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rgw'
Feb  2 04:40:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:40 np0005604791 ceph-mgr[80422]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb  2 04:40:40 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rook'
Feb  2 04:40:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:40.455+0000 7f820d21d140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb  2 04:40:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:40.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:40.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:40 np0005604791 ceph-mgr[80422]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb  2 04:40:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:40.931+0000 7f820d21d140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb  2 04:40:40 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'selftest'
Feb  2 04:40:40 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.b scrub starts
Feb  2 04:40:40 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.b scrub ok
Feb  2 04:40:40 np0005604791 ceph-mgr[80422]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb  2 04:40:40 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'snap_schedule'
Feb  2 04:40:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:40.997+0000 7f820d21d140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'stats'
Feb  2 04:40:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.068+0000 7f820d21d140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'status'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'telegraf'
Feb  2 04:40:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.202+0000 7f820d21d140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'telemetry'
Feb  2 04:40:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.265+0000 7f820d21d140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'test_orchestrator'
Feb  2 04:40:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.414+0000 7f820d21d140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'volumes'
Feb  2 04:40:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.615+0000 7f820d21d140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'zabbix'
Feb  2 04:40:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.865+0000 7f820d21d140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Feb  2 04:40:41 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.929+0000 7f820d21d140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: ms_deliver_dispatch: unhandled message 0x56375e79f860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr handle_mgr_map respawning because set of enabled modules changed!
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  e: '/usr/bin/ceph-mgr'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  0: '/usr/bin/ceph-mgr'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  1: '-n'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  2: 'mgr.compute-1.teascl'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  3: '-f'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  4: '--setuser'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  5: 'ceph'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  6: '--setgroup'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  7: 'ceph'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  8: '--default-log-to-file=false'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  9: '--default-log-to-journald=true'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  10: '--default-log-to-stderr=false'
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: mgr respawn  exe_path /proc/self/exe
Feb  2 04:40:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setuser ceph since I am not root
Feb  2 04:40:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setgroup ceph since I am not root
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Feb  2 04:40:41 np0005604791 ceph-mgr[80422]: pidfile_write: ignore empty --pid-file
Feb  2 04:40:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Feb  2 04:40:42 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'alerts'
Feb  2 04:40:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:42.127+0000 7f947fe91140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb  2 04:40:42 np0005604791 ceph-mgr[80422]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb  2 04:40:42 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'balancer'
Feb  2 04:40:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:42.204+0000 7f947fe91140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb  2 04:40:42 np0005604791 ceph-mgr[80422]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb  2 04:40:42 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'cephadm'
Feb  2 04:40:42 np0005604791 ceph-mon[80115]: Active manager daemon compute-0.djvyfo restarted
Feb  2 04:40:42 np0005604791 ceph-mon[80115]: Activating manager daemon compute-0.djvyfo
Feb  2 04:40:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:42.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:42 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'crash'
Feb  2 04:40:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:42.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:42 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Feb  2 04:40:42 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Feb  2 04:40:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:42.969+0000 7f947fe91140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb  2 04:40:42 np0005604791 ceph-mgr[80422]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb  2 04:40:42 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'dashboard'
Feb  2 04:40:43 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'devicehealth'
Feb  2 04:40:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:43.527+0000 7f947fe91140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb  2 04:40:43 np0005604791 ceph-mgr[80422]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb  2 04:40:43 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'diskprediction_local'
Feb  2 04:40:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb  2 04:40:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb  2 04:40:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]:  from numpy import show_config as show_numpy_config
Feb  2 04:40:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:43.665+0000 7f947fe91140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb  2 04:40:43 np0005604791 ceph-mgr[80422]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb  2 04:40:43 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'influx'
Feb  2 04:40:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:43.727+0000 7f947fe91140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb  2 04:40:43 np0005604791 ceph-mgr[80422]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb  2 04:40:43 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'insights'
Feb  2 04:40:43 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'iostat'
Feb  2 04:40:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:43.849+0000 7f947fe91140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb  2 04:40:43 np0005604791 ceph-mgr[80422]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb  2 04:40:43 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'k8sevents'
Feb  2 04:40:43 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.11 deep-scrub starts
Feb  2 04:40:43 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.11 deep-scrub ok
Feb  2 04:40:44 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'localpool'
Feb  2 04:40:44 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'mds_autoscaler'
Feb  2 04:40:44 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'mirroring'
Feb  2 04:40:44 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'nfs'
Feb  2 04:40:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:44.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:44 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Feb  2 04:40:44 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Feb  2 04:40:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:44.893+0000 7f947fe91140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb  2 04:40:44 np0005604791 ceph-mgr[80422]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb  2 04:40:44 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'orchestrator'
Feb  2 04:40:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:44.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.122+0000 7f947fe91140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'osd_perf_query'
Feb  2 04:40:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.200+0000 7f947fe91140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'osd_support'
Feb  2 04:40:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.269+0000 7f947fe91140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'pg_autoscaler'
Feb  2 04:40:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.354+0000 7f947fe91140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'progress'
Feb  2 04:40:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.427+0000 7f947fe91140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'prometheus'
Feb  2 04:40:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.757+0000 7f947fe91140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rbd_support'
Feb  2 04:40:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.854+0000 7f947fe91140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb  2 04:40:45 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'restful'
Feb  2 04:40:45 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Feb  2 04:40:45 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Feb  2 04:40:46 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rgw'
Feb  2 04:40:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:46.295+0000 7f947fe91140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb  2 04:40:46 np0005604791 ceph-mgr[80422]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb  2 04:40:46 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rook'
Feb  2 04:40:46 np0005604791 systemd[1]: Stopping User Manager for UID 42477...
Feb  2 04:40:46 np0005604791 systemd[72655]: Activating special unit Exit the Session...
Feb  2 04:40:46 np0005604791 systemd[72655]: Stopped target Main User Target.
Feb  2 04:40:46 np0005604791 systemd[72655]: Stopped target Basic System.
Feb  2 04:40:46 np0005604791 systemd[72655]: Stopped target Paths.
Feb  2 04:40:46 np0005604791 systemd[72655]: Stopped target Sockets.
Feb  2 04:40:46 np0005604791 systemd[72655]: Stopped target Timers.
Feb  2 04:40:46 np0005604791 systemd[72655]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb  2 04:40:46 np0005604791 systemd[72655]: Stopped Daily Cleanup of User's Temporary Directories.
Feb  2 04:40:46 np0005604791 systemd[72655]: Closed D-Bus User Message Bus Socket.
Feb  2 04:40:46 np0005604791 systemd[72655]: Stopped Create User's Volatile Files and Directories.
Feb  2 04:40:46 np0005604791 systemd[72655]: Removed slice User Application Slice.
Feb  2 04:40:46 np0005604791 systemd[72655]: Reached target Shutdown.
Feb  2 04:40:46 np0005604791 systemd[72655]: Finished Exit the Session.
Feb  2 04:40:46 np0005604791 systemd[72655]: Reached target Exit the Session.
Feb  2 04:40:46 np0005604791 systemd[1]: user@42477.service: Deactivated successfully.
Feb  2 04:40:46 np0005604791 systemd[1]: Stopped User Manager for UID 42477.
Feb  2 04:40:46 np0005604791 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Feb  2 04:40:46 np0005604791 systemd[1]: run-user-42477.mount: Deactivated successfully.
Feb  2 04:40:46 np0005604791 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Feb  2 04:40:46 np0005604791 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Feb  2 04:40:46 np0005604791 systemd[1]: Removed slice User Slice of UID 42477.
Feb  2 04:40:46 np0005604791 systemd[1]: user-42477.slice: Consumed 1min 531ms CPU time.
Feb  2 04:40:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:46.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:46.878+0000 7f947fe91140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb  2 04:40:46 np0005604791 ceph-mgr[80422]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb  2 04:40:46 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'selftest'
Feb  2 04:40:46 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Feb  2 04:40:46 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Feb  2 04:40:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:46.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:46.964+0000 7f947fe91140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb  2 04:40:46 np0005604791 ceph-mgr[80422]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb  2 04:40:46 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'snap_schedule'
Feb  2 04:40:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.045+0000 7f947fe91140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'stats'
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'status'
Feb  2 04:40:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.187+0000 7f947fe91140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'telegraf'
Feb  2 04:40:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.254+0000 7f947fe91140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'telemetry'
Feb  2 04:40:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.405+0000 7f947fe91140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'test_orchestrator'
Feb  2 04:40:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Feb  2 04:40:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.641+0000 7f947fe91140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'volumes'
Feb  2 04:40:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.917+0000 7f947fe91140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'zabbix'
Feb  2 04:40:47 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Feb  2 04:40:47 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Feb  2 04:40:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.991+0000 7f947fe91140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: ms_deliver_dispatch: unhandled message 0x55ee1e8816c0 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: mgr load Constructed class from module: dashboard
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: [dashboard INFO root] Configured CherryPy, starting engine...
Feb  2 04:40:47 np0005604791 ceph-mgr[80422]: [dashboard INFO root] Starting engine...
Feb  2 04:40:48 np0005604791 ceph-mgr[80422]: [dashboard INFO root] Engine started...
Feb  2 04:40:48 np0005604791 systemd[1]: Created slice User Slice of UID 42477.
Feb  2 04:40:48 np0005604791 systemd[1]: Starting User Runtime Directory /run/user/42477...
Feb  2 04:40:48 np0005604791 systemd-logind[805]: New session 34 of user ceph-admin.
Feb  2 04:40:48 np0005604791 systemd[1]: Finished User Runtime Directory /run/user/42477.
Feb  2 04:40:48 np0005604791 systemd[1]: Starting User Manager for UID 42477...
Feb  2 04:40:48 np0005604791 ceph-mon[80115]: Active manager daemon compute-0.djvyfo restarted
Feb  2 04:40:48 np0005604791 ceph-mon[80115]: Activating manager daemon compute-0.djvyfo
Feb  2 04:40:48 np0005604791 ceph-mon[80115]: Manager daemon compute-0.djvyfo is now available
Feb  2 04:40:48 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/mirror_snapshot_schedule"}]: dispatch
Feb  2 04:40:48 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/trash_purge_schedule"}]: dispatch
Feb  2 04:40:48 np0005604791 systemd[83549]: Queued start job for default target Main User Target.
Feb  2 04:40:48 np0005604791 systemd[83549]: Created slice User Application Slice.
Feb  2 04:40:48 np0005604791 systemd[83549]: Started Mark boot as successful after the user session has run 2 minutes.
Feb  2 04:40:48 np0005604791 systemd[83549]: Started Daily Cleanup of User's Temporary Directories.
Feb  2 04:40:48 np0005604791 systemd[83549]: Reached target Paths.
Feb  2 04:40:48 np0005604791 systemd[83549]: Reached target Timers.
Feb  2 04:40:48 np0005604791 systemd[83549]: Starting D-Bus User Message Bus Socket...
Feb  2 04:40:48 np0005604791 systemd[83549]: Starting Create User's Volatile Files and Directories...
Feb  2 04:40:48 np0005604791 systemd[83549]: Listening on D-Bus User Message Bus Socket.
Feb  2 04:40:48 np0005604791 systemd[83549]: Reached target Sockets.
Feb  2 04:40:48 np0005604791 systemd[83549]: Finished Create User's Volatile Files and Directories.
Feb  2 04:40:48 np0005604791 systemd[83549]: Reached target Basic System.
Feb  2 04:40:48 np0005604791 systemd[83549]: Reached target Main User Target.
Feb  2 04:40:48 np0005604791 systemd[83549]: Startup finished in 119ms.
Feb  2 04:40:48 np0005604791 systemd[1]: Started User Manager for UID 42477.
Feb  2 04:40:48 np0005604791 systemd[1]: Started Session 34 of User ceph-admin.
Feb  2 04:40:48 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e2 new map
Feb  2 04:40:48 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e2 print_map#012e2#012btime 2026-02-02T09:40:48:656641+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-02T09:40:48.656583+0000#012modified#0112026-02-02T09:40:48.656583+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 
Feb  2 04:40:48 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Feb  2 04:40:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:48.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:48.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:48 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Feb  2 04:40:48 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Feb  2 04:40:49 np0005604791 podman[83687]: 2026-02-02 09:40:49.193362615 +0000 UTC m=+0.073392480 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Feb  2 04:40:49 np0005604791 podman[83687]: 2026-02-02 09:40:49.287017481 +0000 UTC m=+0.167047326 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:40:49 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Feb  2 04:40:49 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Feb  2 04:40:49 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Feb  2 04:40:49 np0005604791 ceph-mon[80115]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Feb  2 04:40:49 np0005604791 ceph-mon[80115]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Feb  2 04:40:49 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Feb  2 04:40:49 np0005604791 ceph-mon[80115]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Feb  2 04:40:49 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:49 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Feb  2 04:40:49 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:40:49] ENGINE Bus STARTING
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:40:49] ENGINE Serving on https://192.168.122.100:7150
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:40:49] ENGINE Client ('192.168.122.100', 55968) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:40:49] ENGINE Serving on http://192.168.122.100:8765
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:40:49] ENGINE Bus STARTED
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:50.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:50.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:50 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Feb  2 04:40:50 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Feb  2 04:40:51 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:51 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Feb  2 04:40:51 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:51 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:51 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb  2 04:40:51 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:51 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:51 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb  2 04:40:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Feb  2 04:40:51 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Feb  2 04:40:51 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.conf
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.conf
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.conf
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:40:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Feb  2 04:40:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:52.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:52.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:52 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Feb  2 04:40:52 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:53 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Feb  2 04:40:53 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Feb  2 04:40:53 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Feb  2 04:40:54 np0005604791 systemd[1]: Reloading.
Feb  2 04:40:54 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:40:54 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:40:54 np0005604791 systemd[1]: Reloading.
Feb  2 04:40:54 np0005604791 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb  2 04:40:54 np0005604791 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb  2 04:40:54 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:54 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:54 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:54 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:54 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:54 np0005604791 ceph-mon[80115]: Deploying daemon node-exporter.compute-1 on compute-1
Feb  2 04:40:54 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:40:54 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:40:54 np0005604791 systemd[1]: Starting Ceph node-exporter.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:40:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:54.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:54.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:54 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.6 deep-scrub starts
Feb  2 04:40:54 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.6 deep-scrub ok
Feb  2 04:40:54 np0005604791 bash[85049]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Feb  2 04:40:55 np0005604791 bash[85049]: Getting image source signatures
Feb  2 04:40:55 np0005604791 bash[85049]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Feb  2 04:40:55 np0005604791 bash[85049]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Feb  2 04:40:55 np0005604791 bash[85049]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Feb  2 04:40:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:40:55 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1616834281' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Feb  2 04:40:55 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/1616834281' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Feb  2 04:40:55 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.c scrub starts
Feb  2 04:40:55 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.c scrub ok
Feb  2 04:40:55 np0005604791 bash[85049]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Feb  2 04:40:55 np0005604791 bash[85049]: Writing manifest to image destination
Feb  2 04:40:56 np0005604791 podman[85049]: 2026-02-02 09:40:56.009607708 +0000 UTC m=+1.058240999 container create 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 04:40:56 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abf7e2a0b411b65b99726b953b1d4146d3ad8b02be300aae3a359d70a1365d66/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Feb  2 04:40:56 np0005604791 podman[85049]: 2026-02-02 09:40:56.057557615 +0000 UTC m=+1.106190926 container init 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 04:40:56 np0005604791 podman[85049]: 2026-02-02 09:40:56.061123788 +0000 UTC m=+1.109757069 container start 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 04:40:56 np0005604791 bash[85049]: 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07
Feb  2 04:40:56 np0005604791 podman[85049]: 2026-02-02 09:40:55.9966255 +0000 UTC m=+1.045258811 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.067Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.067Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.068Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.069Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.069Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.069Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=arp
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=bcache
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=bonding
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=btrfs
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=conntrack
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=cpu
Feb  2 04:40:56 np0005604791 systemd[1]: Started Ceph node-exporter.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=cpufreq
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=diskstats
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=dmi
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=edac
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=entropy
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=fibrechannel
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=filefd
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=filesystem
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=hwmon
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=infiniband
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=ipvs
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=loadavg
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=mdadm
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=meminfo
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=netclass
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=netdev
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=netstat
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=nfs
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=nfsd
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=nvme
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=os
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=pressure
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=rapl
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=schedstat
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=selinux
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=sockstat
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=softnet
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=stat
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=tapestats
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=textfile
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=thermal_zone
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=time
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=udp_queues
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=uname
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=vmstat
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=xfs
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=zfs
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.075Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Feb  2 04:40:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.075Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Feb  2 04:40:56 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:56 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:56 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:56.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:56.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:56 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.a scrub starts
Feb  2 04:40:56 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.a scrub ok
Feb  2 04:40:57 np0005604791 ceph-mon[80115]: Deploying daemon node-exporter.compute-2 on compute-2
Feb  2 04:40:57 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.17 deep-scrub starts
Feb  2 04:40:57 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.17 deep-scrub ok
Feb  2 04:40:58 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:58 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:58 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:58 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:40:58 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:40:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:40:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:58.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:40:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:40:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:40:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:58.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:40:58 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Feb  2 04:40:58 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Feb  2 04:40:59 np0005604791 ceph-mon[80115]: from='client.? 192.168.122.100:0/2118971521' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Feb  2 04:40:59 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Feb  2 04:40:59 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Feb  2 04:41:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:00.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:00.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:00 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Feb  2 04:41:00 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Feb  2 04:41:01 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Feb  2 04:41:01 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Feb  2 04:41:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:02.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:02 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Feb  2 04:41:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:02.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:02 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Feb  2 04:41:03 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Feb  2 04:41:03 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Feb  2 04:41:03 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:03 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:03 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:03 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.vvohrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Feb  2 04:41:03 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.vvohrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Feb  2 04:41:03 np0005604791 ceph-mon[80115]: Deploying daemon mds.cephfs.compute-2.vvohrf on compute-2
Feb  2 04:41:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:04.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:04.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:04 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Feb  2 04:41:04 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Feb  2 04:41:05 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:05 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:05 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:05 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.clmmzw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Feb  2 04:41:05 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.clmmzw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Feb  2 04:41:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e3 new map
Feb  2 04:41:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e3 print_map#012e3#012btime 2026-02-02T09:41:05:061446+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-02T09:40:48.656583+0000#012modified#0112026-02-02T09:40:48.656583+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.vvohrf{-1:24310} state up:standby seq 1 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]
Feb  2 04:41:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e4 new map
Feb  2 04:41:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e4 print_map#012e4#012btime 2026-02-02T09:41:05:094248+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-02T09:40:48.656583+0000#012modified#0112026-02-02T09:41:05.094239+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24310}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-2.vvohrf{0:24310} state up:creating seq 1 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Feb  2 04:41:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:05 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Feb  2 04:41:05 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Feb  2 04:41:06 np0005604791 ceph-mon[80115]: Deploying daemon mds.cephfs.compute-0.clmmzw on compute-0
Feb  2 04:41:06 np0005604791 ceph-mon[80115]: daemon mds.cephfs.compute-2.vvohrf assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Feb  2 04:41:06 np0005604791 ceph-mon[80115]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Feb  2 04:41:06 np0005604791 ceph-mon[80115]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Feb  2 04:41:06 np0005604791 ceph-mon[80115]: Cluster is now healthy
Feb  2 04:41:06 np0005604791 ceph-mon[80115]: daemon mds.cephfs.compute-2.vvohrf is now active in filesystem cephfs as rank 0
Feb  2 04:41:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e5 new map
Feb  2 04:41:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e5 print_map#012e5#012btime 2026-02-02T09:41:06:101701+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-02T09:40:48.656583+0000#012modified#0112026-02-02T09:41:06.101697+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24310}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24310 members: 24310#012[mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 2 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Feb  2 04:41:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:41:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:06.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:41:06 np0005604791 podman[85224]: 2026-02-02 09:41:06.924940082 +0000 UTC m=+0.034776926 container create 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:41:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:06.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:06 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.e scrub starts
Feb  2 04:41:06 np0005604791 systemd[1]: Started libpod-conmon-4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe.scope.
Feb  2 04:41:06 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.e scrub ok
Feb  2 04:41:06 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:41:07 np0005604791 podman[85224]: 2026-02-02 09:41:06.90989973 +0000 UTC m=+0.019736594 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:41:07 np0005604791 podman[85224]: 2026-02-02 09:41:07.008730991 +0000 UTC m=+0.118567855 container init 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Feb  2 04:41:07 np0005604791 podman[85224]: 2026-02-02 09:41:07.013794293 +0000 UTC m=+0.123631137 container start 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:41:07 np0005604791 podman[85224]: 2026-02-02 09:41:07.017166231 +0000 UTC m=+0.127003105 container attach 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:41:07 np0005604791 elegant_solomon[85241]: 167 167
Feb  2 04:41:07 np0005604791 systemd[1]: libpod-4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe.scope: Deactivated successfully.
Feb  2 04:41:07 np0005604791 conmon[85241]: conmon 4ed90f2d39cf69139d83 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe.scope/container/memory.events
Feb  2 04:41:07 np0005604791 podman[85224]: 2026-02-02 09:41:07.02060858 +0000 UTC m=+0.130445444 container died 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:41:07 np0005604791 systemd[1]: var-lib-containers-storage-overlay-cafcddcc12274bd2a400d470b95bc34e4047f4c09913d01a84e09918c0640db6-merged.mount: Deactivated successfully.
Feb  2 04:41:07 np0005604791 podman[85224]: 2026-02-02 09:41:07.052358446 +0000 UTC m=+0.162195290 container remove 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:41:07 np0005604791 systemd[1]: libpod-conmon-4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe.scope: Deactivated successfully.
Feb  2 04:41:07 np0005604791 systemd[1]: Reloading.
Feb  2 04:41:07 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:07 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:07 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:07 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.khfsen", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Feb  2 04:41:07 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.khfsen", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Feb  2 04:41:07 np0005604791 ceph-mon[80115]: Deploying daemon mds.cephfs.compute-1.khfsen on compute-1
Feb  2 04:41:07 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:41:07 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:41:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e6 new map
Feb  2 04:41:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e6 print_map#012e6#012btime 2026-02-02T09:41:07:200268+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-02T09:40:48.656583+0000#012modified#0112026-02-02T09:41:06.101697+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24310}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24310 members: 24310#012[mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 2 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]
Feb  2 04:41:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e7 new map
Feb  2 04:41:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e7 print_map#012e7#012btime 2026-02-02T09:41:07:213917+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-02T09:40:48.656583+0000#012modified#0112026-02-02T09:41:06.101697+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24310}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24310 members: 24310#012[mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 2 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]
Feb  2 04:41:07 np0005604791 systemd[1]: Reloading.
Feb  2 04:41:07 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:41:07 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:41:07 np0005604791 systemd[1]: Starting Ceph mds.cephfs.compute-1.khfsen for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:41:07 np0005604791 podman[85383]: 2026-02-02 09:41:07.841635738 +0000 UTC m=+0.048576015 container create d2de892f7e328a2b8439d80aa4f6b300d90cd45464028edda638ab25282d74d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mds-cephfs-compute-1-khfsen, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:41:07 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64b9fb5e8fc492fbdc69c58e90ea42bcfd097ef3c941da218c6aec616c8c7bc2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:07 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64b9fb5e8fc492fbdc69c58e90ea42bcfd097ef3c941da218c6aec616c8c7bc2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:07 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64b9fb5e8fc492fbdc69c58e90ea42bcfd097ef3c941da218c6aec616c8c7bc2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:07 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64b9fb5e8fc492fbdc69c58e90ea42bcfd097ef3c941da218c6aec616c8c7bc2/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.khfsen supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:07 np0005604791 podman[85383]: 2026-02-02 09:41:07.910683044 +0000 UTC m=+0.117623381 container init d2de892f7e328a2b8439d80aa4f6b300d90cd45464028edda638ab25282d74d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mds-cephfs-compute-1-khfsen, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:41:07 np0005604791 podman[85383]: 2026-02-02 09:41:07.818473545 +0000 UTC m=+0.025413892 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:41:07 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Feb  2 04:41:07 np0005604791 podman[85383]: 2026-02-02 09:41:07.918671842 +0000 UTC m=+0.125612159 container start d2de892f7e328a2b8439d80aa4f6b300d90cd45464028edda638ab25282d74d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mds-cephfs-compute-1-khfsen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Feb  2 04:41:07 np0005604791 bash[85383]: d2de892f7e328a2b8439d80aa4f6b300d90cd45464028edda638ab25282d74d0
Feb  2 04:41:07 np0005604791 systemd[1]: Started Ceph mds.cephfs.compute-1.khfsen for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:41:07 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Feb  2 04:41:07 np0005604791 ceph-mds[85402]: set uid:gid to 167:167 (ceph:ceph)
Feb  2 04:41:07 np0005604791 ceph-mds[85402]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Feb  2 04:41:07 np0005604791 ceph-mds[85402]: main not setting numa affinity
Feb  2 04:41:07 np0005604791 ceph-mds[85402]: pidfile_write: ignore empty --pid-file
Feb  2 04:41:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mds-cephfs-compute-1-khfsen[85398]: starting mds.cephfs.compute-1.khfsen at 
Feb  2 04:41:07 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Updating MDS map to version 7 from mon.2
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: Creating key for client.nfs.cephfs.0.0.compute-1.mhzhsx
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mhzhsx", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mhzhsx", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e8 new map
Feb  2 04:41:08 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e8 print_map#012e8#012btime 2026-02-02T09:41:08:229569+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-02T09:40:48.656583+0000#012modified#0112026-02-02T09:41:06.101697+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24310}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24310 members: 24310#012[mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 2 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.khfsen{-1:24317} state up:standby seq 1 addr [v2:192.168.122.101:6804/685771812,v1:192.168.122.101:6805/685771812] compat {c=[1],r=[1],i=[1fff]}]
Feb  2 04:41:08 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Updating MDS map to version 8 from mon.2
Feb  2 04:41:08 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Monitors have assigned me to become a standby
Feb  2 04:41:08 np0005604791 podman[85513]: 2026-02-02 09:41:08.840992434 +0000 UTC m=+0.046761167 container create 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True)
Feb  2 04:41:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:08.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:08 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.f scrub starts
Feb  2 04:41:08 np0005604791 systemd[1]: Started libpod-conmon-1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407.scope.
Feb  2 04:41:08 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.f scrub ok
Feb  2 04:41:08 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:41:08 np0005604791 podman[85513]: 2026-02-02 09:41:08.822900474 +0000 UTC m=+0.028669237 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:41:08 np0005604791 podman[85513]: 2026-02-02 09:41:08.926627742 +0000 UTC m=+0.132396515 container init 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:41:08 np0005604791 podman[85513]: 2026-02-02 09:41:08.936767966 +0000 UTC m=+0.142536729 container start 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:41:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:08.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:08 np0005604791 podman[85513]: 2026-02-02 09:41:08.940341969 +0000 UTC m=+0.146110702 container attach 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:41:08 np0005604791 adoring_nightingale[85530]: 167 167
Feb  2 04:41:08 np0005604791 systemd[1]: libpod-1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407.scope: Deactivated successfully.
Feb  2 04:41:08 np0005604791 conmon[85530]: conmon 1b5e2b32bee1d3686e46 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407.scope/container/memory.events
Feb  2 04:41:08 np0005604791 podman[85513]: 2026-02-02 09:41:08.945682578 +0000 UTC m=+0.151451311 container died 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 04:41:08 np0005604791 systemd[1]: var-lib-containers-storage-overlay-044e511332105de50e14cd72a355e14173d59e72e6b82d9b0bb0fba72e34493a-merged.mount: Deactivated successfully.
Feb  2 04:41:08 np0005604791 podman[85513]: 2026-02-02 09:41:08.982937277 +0000 UTC m=+0.188706030 container remove 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:41:08 np0005604791 systemd[1]: libpod-conmon-1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407.scope: Deactivated successfully.
Feb  2 04:41:09 np0005604791 systemd[1]: Reloading.
Feb  2 04:41:09 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:41:09 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:41:09 np0005604791 ceph-mon[80115]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Feb  2 04:41:09 np0005604791 ceph-mon[80115]: Rados config object exists: conf-nfs.cephfs
Feb  2 04:41:09 np0005604791 ceph-mon[80115]: Creating key for client.nfs.cephfs.0.0.compute-1.mhzhsx-rgw
Feb  2 04:41:09 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mhzhsx-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb  2 04:41:09 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mhzhsx-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb  2 04:41:09 np0005604791 ceph-mon[80115]: Bind address in nfs.cephfs.0.0.compute-1.mhzhsx's ganesha conf is defaulting to empty
Feb  2 04:41:09 np0005604791 ceph-mon[80115]: Deploying daemon nfs.cephfs.0.0.compute-1.mhzhsx on compute-1
Feb  2 04:41:09 np0005604791 systemd[1]: Reloading.
Feb  2 04:41:09 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e9 new map
Feb  2 04:41:09 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e9 print_map#012e9#012btime 2026-02-02T09:41:09:317331+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0119#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-02T09:40:48.656583+0000#012modified#0112026-02-02T09:41:09.133293+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24310}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24310 members: 24310#012[mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.khfsen{-1:24317} state up:standby seq 1 addr [v2:192.168.122.101:6804/685771812,v1:192.168.122.101:6805/685771812] compat {c=[1],r=[1],i=[1fff]}]
Feb  2 04:41:09 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:41:09 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:41:09 np0005604791 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:41:09 np0005604791 podman[85673]: 2026-02-02 09:41:09.776624623 +0000 UTC m=+0.057330372 container create b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Feb  2 04:41:09 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f000e337d81c95375c89b4aea8131c7417a0d2d466dbffabd46cfe3e1a43abb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:09 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f000e337d81c95375c89b4aea8131c7417a0d2d466dbffabd46cfe3e1a43abb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:09 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f000e337d81c95375c89b4aea8131c7417a0d2d466dbffabd46cfe3e1a43abb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:09 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f000e337d81c95375c89b4aea8131c7417a0d2d466dbffabd46cfe3e1a43abb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:09 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Feb  2 04:41:09 np0005604791 podman[85673]: 2026-02-02 09:41:09.833074402 +0000 UTC m=+0.113780181 container init b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True)
Feb  2 04:41:09 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Feb  2 04:41:09 np0005604791 podman[85673]: 2026-02-02 09:41:09.84222223 +0000 UTC m=+0.122927979 container start b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:41:09 np0005604791 podman[85673]: 2026-02-02 09:41:09.750632597 +0000 UTC m=+0.031338416 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:41:09 np0005604791 bash[85673]: b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e
Feb  2 04:41:09 np0005604791 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:41:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:41:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:10 np0005604791 ceph-mon[80115]: Creating key for client.nfs.cephfs.1.0.compute-2.dciyfa
Feb  2 04:41:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dciyfa", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Feb  2 04:41:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dciyfa", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Feb  2 04:41:10 np0005604791 ceph-mon[80115]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Feb  2 04:41:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Feb  2 04:41:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Feb  2 04:41:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:10.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:10 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.b deep-scrub starts
Feb  2 04:41:10 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.b deep-scrub ok
Feb  2 04:41:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:10.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e10 new map
Feb  2 04:41:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e10 print_map#012e10#012btime 2026-02-02T09:41:11:335086+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0119#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-02T09:40:48.656583+0000#012modified#0112026-02-02T09:41:09.133293+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24310}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24310 members: 24310#012[mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.khfsen{-1:24317} state up:standby seq 1 addr [v2:192.168.122.101:6804/685771812,v1:192.168.122.101:6805/685771812] compat {c=[1],r=[1],i=[1fff]}]
Feb  2 04:41:11 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Feb  2 04:41:11 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Feb  2 04:41:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e11 new map
Feb  2 04:41:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).mds e11 print_map#012e11#012btime 2026-02-02T09:41:12:473139+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0119#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-02T09:40:48.656583+0000#012modified#0112026-02-02T09:41:09.133293+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24310}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24310 members: 24310#012[mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat 
{c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.khfsen{-1:24317} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/685771812,v1:192.168.122.101:6805/685771812] compat {c=[1],r=[1],i=[1fff]}]
Feb  2 04:41:12 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Updating MDS map to version 11 from mon.2
Feb  2 04:41:12 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.10 scrub starts
Feb  2 04:41:12 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.10 scrub ok
Feb  2 04:41:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:12.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:12.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000001:nfs.cephfs.0: -2
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb  2 04:41:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb  2 04:41:13 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:13 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Feb  2 04:41:13 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Feb  2 04:41:13 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dciyfa-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb  2 04:41:13 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dciyfa-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb  2 04:41:13 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Feb  2 04:41:13 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Feb  2 04:41:14 np0005604791 ceph-mon[80115]: Rados config object exists: conf-nfs.cephfs
Feb  2 04:41:14 np0005604791 ceph-mon[80115]: Creating key for client.nfs.cephfs.1.0.compute-2.dciyfa-rgw
Feb  2 04:41:14 np0005604791 ceph-mon[80115]: Bind address in nfs.cephfs.1.0.compute-2.dciyfa's ganesha conf is defaulting to empty
Feb  2 04:41:14 np0005604791 ceph-mon[80115]: Deploying daemon nfs.cephfs.1.0.compute-2.dciyfa on compute-2
Feb  2 04:41:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:14.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:14 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:41:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:14 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:41:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:14 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:41:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:14 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:41:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:14.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:14 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: Creating key for client.nfs.cephfs.2.0.compute-0.fdwwab
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fdwwab", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fdwwab", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fdwwab-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb  2 04:41:15 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fdwwab-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb  2 04:41:16 np0005604791 ceph-mon[80115]: Rados config object exists: conf-nfs.cephfs
Feb  2 04:41:16 np0005604791 ceph-mon[80115]: Creating key for client.nfs.cephfs.2.0.compute-0.fdwwab-rgw
Feb  2 04:41:16 np0005604791 ceph-mon[80115]: Bind address in nfs.cephfs.2.0.compute-0.fdwwab's ganesha conf is defaulting to empty
Feb  2 04:41:16 np0005604791 ceph-mon[80115]: Deploying daemon nfs.cephfs.2.0.compute-0.fdwwab on compute-0
Feb  2 04:41:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:16.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:16.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:17 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:41:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:17 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:41:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:17 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:41:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:17 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:41:17 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:17 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:17 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:17 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:17 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:18 np0005604791 ceph-mon[80115]: Deploying daemon haproxy.nfs.cephfs.compute-1.sryqbx on compute-1
Feb  2 04:41:18 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:18.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:18.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:19 np0005604791 podman[85832]: 2026-02-02 09:41:19.933067386 +0000 UTC m=+2.188622894 container create 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb  2 04:41:19 np0005604791 systemd[1]: Started libpod-conmon-002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4.scope.
Feb  2 04:41:19 np0005604791 podman[85832]: 2026-02-02 09:41:19.918002274 +0000 UTC m=+2.173557812 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Feb  2 04:41:20 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:41:20 np0005604791 podman[85832]: 2026-02-02 09:41:20.01855902 +0000 UTC m=+2.274114548 container init 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb  2 04:41:20 np0005604791 podman[85832]: 2026-02-02 09:41:20.026217389 +0000 UTC m=+2.281772897 container start 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb  2 04:41:20 np0005604791 podman[85832]: 2026-02-02 09:41:20.029070243 +0000 UTC m=+2.284625751 container attach 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb  2 04:41:20 np0005604791 nervous_leakey[85947]: 0 0
Feb  2 04:41:20 np0005604791 systemd[1]: libpod-002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4.scope: Deactivated successfully.
Feb  2 04:41:20 np0005604791 podman[85832]: 2026-02-02 09:41:20.032441821 +0000 UTC m=+2.287997319 container died 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb  2 04:41:20 np0005604791 systemd[1]: var-lib-containers-storage-overlay-6b69922dac310fe717a1652596151c0046bd7ee0432f5a90e39eef0cc5c78c16-merged.mount: Deactivated successfully.
Feb  2 04:41:20 np0005604791 podman[85832]: 2026-02-02 09:41:20.064900665 +0000 UTC m=+2.320456183 container remove 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb  2 04:41:20 np0005604791 systemd[1]: libpod-conmon-002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4.scope: Deactivated successfully.
Feb  2 04:41:20 np0005604791 systemd[1]: Reloading.
Feb  2 04:41:20 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:41:20 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:41:20 np0005604791 systemd[1]: Reloading.
Feb  2 04:41:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:20 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:41:20 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:41:20 np0005604791 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-1.sryqbx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:41:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:20.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:20 np0005604791 podman[86095]: 2026-02-02 09:41:20.91772211 +0000 UTC m=+0.041358277 container create 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 04:41:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:20 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cd7be9a1d6d329ee96cef4a74a73dfb17663a86205b28cdea15cb6da056cb0e/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:20.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:20 np0005604791 podman[86095]: 2026-02-02 09:41:20.981698415 +0000 UTC m=+0.105334582 container init 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 04:41:20 np0005604791 podman[86095]: 2026-02-02 09:41:20.987191467 +0000 UTC m=+0.110827634 container start 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 04:41:20 np0005604791 bash[86095]: 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2
Feb  2 04:41:20 np0005604791 podman[86095]: 2026-02-02 09:41:20.901159619 +0000 UTC m=+0.024795776 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Feb  2 04:41:20 np0005604791 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-1.sryqbx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:41:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [NOTICE] 032/094120 (2) : New worker #1 (4) forked
Feb  2 04:41:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:21 : epoch 69807135 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9dc000df0 fd 37 proxy ignored for local
Feb  2 04:41:21 np0005604791 kernel: ganesha.nfsd[86123]: segfault at 50 ip 00007fba5d3d632e sp 00007fb9c67fb210 error 4 in libntirpc.so.5.8[7fba5d3bb000+2c000] likely on CPU 0 (core 0, socket 0)
Feb  2 04:41:21 np0005604791 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb  2 04:41:21 np0005604791 systemd[1]: Created slice Slice /system/systemd-coredump.
Feb  2 04:41:21 np0005604791 systemd[1]: Started Process Core Dump (PID 86126/UID 0).
Feb  2 04:41:22 np0005604791 systemd-coredump[86127]: Process 85692 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 52:#012#0  0x00007fba5d3d632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007fba5d3e0900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Feb  2 04:41:22 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:22 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:22 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:22 np0005604791 ceph-mon[80115]: Deploying daemon haproxy.nfs.cephfs.compute-0.ooxkuo on compute-0
Feb  2 04:41:22 np0005604791 systemd[1]: systemd-coredump@0-86126-0.service: Deactivated successfully.
Feb  2 04:41:22 np0005604791 podman[86132]: 2026-02-02 09:41:22.15750653 +0000 UTC m=+0.038033820 container died b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:41:22 np0005604791 systemd[1]: var-lib-containers-storage-overlay-4f000e337d81c95375c89b4aea8131c7417a0d2d466dbffabd46cfe3e1a43abb-merged.mount: Deactivated successfully.
Feb  2 04:41:22 np0005604791 podman[86132]: 2026-02-02 09:41:22.206793042 +0000 UTC m=+0.087320332 container remove b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 04:41:22 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb  2 04:41:22 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 04:41:22 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.148s CPU time.
Feb  2 04:41:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:22.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:22.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:23 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:23 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:23 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:23 np0005604791 ceph-mon[80115]: Deploying daemon haproxy.nfs.cephfs.compute-2.arssaq on compute-2
Feb  2 04:41:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:24.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:24 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:24 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:24 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:24 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:24.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:25 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Feb  2 04:41:25 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Feb  2 04:41:25 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Feb  2 04:41:25 np0005604791 ceph-mon[80115]: Deploying daemon keepalived.nfs.cephfs.compute-2.tgzfzm on compute-2
Feb  2 04:41:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:26.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:26.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094127 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:41:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:28.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:28.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:30 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:30 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:30 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:30 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Feb  2 04:41:30 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Feb  2 04:41:30 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Feb  2 04:41:30 np0005604791 ceph-mon[80115]: Deploying daemon keepalived.nfs.cephfs.compute-0.pqolko on compute-0
Feb  2 04:41:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:30.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:30.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:32 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 1.
Feb  2 04:41:32 np0005604791 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:41:32 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.148s CPU time.
Feb  2 04:41:32 np0005604791 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:41:32 np0005604791 podman[86221]: 2026-02-02 09:41:32.79561145 +0000 UTC m=+0.050065983 container create 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:41:32 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fffc5fa1fa69a3fdffe465386ca294ac1ff31a0b62c5a54f0b0563f32f221eaf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:32 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fffc5fa1fa69a3fdffe465386ca294ac1ff31a0b62c5a54f0b0563f32f221eaf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:32 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fffc5fa1fa69a3fdffe465386ca294ac1ff31a0b62c5a54f0b0563f32f221eaf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:32 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fffc5fa1fa69a3fdffe465386ca294ac1ff31a0b62c5a54f0b0563f32f221eaf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:32 np0005604791 podman[86221]: 2026-02-02 09:41:32.852428008 +0000 UTC m=+0.106882511 container init 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:41:32 np0005604791 podman[86221]: 2026-02-02 09:41:32.861841483 +0000 UTC m=+0.116296026 container start 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Feb  2 04:41:32 np0005604791 bash[86221]: 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8
Feb  2 04:41:32 np0005604791 podman[86221]: 2026-02-02 09:41:32.772103219 +0000 UTC m=+0.026557802 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:41:32 np0005604791 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:41:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb  2 04:41:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb  2 04:41:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:32.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb  2 04:41:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb  2 04:41:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb  2 04:41:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb  2 04:41:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb  2 04:41:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:32.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:41:34 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:34 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:34 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:34 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Feb  2 04:41:34 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Feb  2 04:41:34 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Feb  2 04:41:34 np0005604791 ceph-mon[80115]: Deploying daemon keepalived.nfs.cephfs.compute-1.whrwoq on compute-1
Feb  2 04:41:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:34.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:41:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:34.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:41:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:41:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:36.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:41:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:41:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:36.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:41:37 np0005604791 podman[86370]: 2026-02-02 09:41:37.32210531 +0000 UTC m=+3.129237753 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Feb  2 04:41:37 np0005604791 podman[86370]: 2026-02-02 09:41:37.423232126 +0000 UTC m=+3.230364529 container create 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, release=1793, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, name=keepalived, vcs-type=git, com.redhat.component=keepalived-container, io.openshift.expose-services=, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, build-date=2023-02-22T09:23:20, architecture=x86_64, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc.)
Feb  2 04:41:37 np0005604791 systemd[1]: Started libpod-conmon-234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec.scope.
Feb  2 04:41:37 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:41:37 np0005604791 podman[86370]: 2026-02-02 09:41:37.525491409 +0000 UTC m=+3.332623862 container init 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, version=2.2.4, architecture=x86_64, io.openshift.tags=Ceph keepalived, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, release=1793, io.buildah.version=1.28.2, distribution-scope=public)
Feb  2 04:41:37 np0005604791 podman[86370]: 2026-02-02 09:41:37.534041659 +0000 UTC m=+3.341174062 container start 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, io.buildah.version=1.28.2, name=keepalived, io.openshift.expose-services=, description=keepalived for Ceph, architecture=x86_64, com.redhat.component=keepalived-container, release=1793, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, io.openshift.tags=Ceph keepalived, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb  2 04:41:37 np0005604791 nervous_goodall[86467]: 0 0
Feb  2 04:41:37 np0005604791 systemd[1]: libpod-234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec.scope: Deactivated successfully.
Feb  2 04:41:37 np0005604791 podman[86370]: 2026-02-02 09:41:37.546306811 +0000 UTC m=+3.353439264 container attach 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, io.openshift.expose-services=, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, description=keepalived for Ceph, io.buildah.version=1.28.2, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, com.redhat.component=keepalived-container, name=keepalived, version=2.2.4, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Feb  2 04:41:37 np0005604791 podman[86370]: 2026-02-02 09:41:37.547546251 +0000 UTC m=+3.354678644 container died 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.tags=Ceph keepalived, release=1793, version=2.2.4, build-date=2023-02-22T09:23:20, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.openshift.expose-services=)
Feb  2 04:41:37 np0005604791 systemd[1]: var-lib-containers-storage-overlay-f0b1f7f70685725d38e8079ceab8f79a91a3e9352fcb7d96bcfa543c847967f2-merged.mount: Deactivated successfully.
Feb  2 04:41:37 np0005604791 podman[86370]: 2026-02-02 09:41:37.619782517 +0000 UTC m=+3.426914920 container remove 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, version=2.2.4, build-date=2023-02-22T09:23:20, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.buildah.version=1.28.2)
Feb  2 04:41:37 np0005604791 systemd[1]: libpod-conmon-234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec.scope: Deactivated successfully.
Feb  2 04:41:37 np0005604791 systemd[1]: Reloading.
Feb  2 04:41:37 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:41:37 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:41:38 np0005604791 systemd[1]: Reloading.
Feb  2 04:41:38 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:41:38 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:41:38 np0005604791 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-1.whrwoq for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:41:38 np0005604791 podman[86616]: 2026-02-02 09:41:38.552635525 +0000 UTC m=+0.055493924 container create 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, name=keepalived, vcs-type=git, description=keepalived for Ceph, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, version=2.2.4, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb  2 04:41:38 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/855c0c84ba32e92e80b8de6281d0a005428bc730406eeeb6d1f4ea34daf14c34/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:41:38 np0005604791 podman[86616]: 2026-02-02 09:41:38.528327988 +0000 UTC m=+0.031186407 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Feb  2 04:41:38 np0005604791 podman[86616]: 2026-02-02 09:41:38.622434611 +0000 UTC m=+0.125293040 container init 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., description=keepalived for Ceph, io.openshift.expose-services=, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, vcs-type=git, version=2.2.4, build-date=2023-02-22T09:23:20, distribution-scope=public, architecture=x86_64, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Feb  2 04:41:38 np0005604791 podman[86616]: 2026-02-02 09:41:38.630621912 +0000 UTC m=+0.133480321 container start 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=keepalived, description=keepalived for Ceph, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, vcs-type=git, architecture=x86_64, version=2.2.4, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Feb  2 04:41:38 np0005604791 bash[86616]: 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2
Feb  2 04:41:38 np0005604791 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-1.whrwoq for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:41:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Starting Keepalived v2.2.4 (08/21,2021)
Feb  2 04:41:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Running on Linux 5.14.0-665.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026 (built for Linux 5.14.0)
Feb  2 04:41:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Feb  2 04:41:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Configuration file /etc/keepalived/keepalived.conf
Feb  2 04:41:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Feb  2 04:41:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Starting VRRP child process, pid=4
Feb  2 04:41:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Startup complete
Feb  2 04:41:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: (VI_0) Entering BACKUP STATE (init)
Feb  2 04:41:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: VRRP_Script(check_backend) succeeded
Feb  2 04:41:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:38.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:38 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:38 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:38 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:38 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:41:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:38.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:41:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:39 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:41:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:39 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:41:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:40.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:40.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:41 np0005604791 ceph-mon[80115]: Deploying daemon alertmanager.compute-0 on compute-0
Feb  2 04:41:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:42 2026: (VI_0) Entering MASTER STATE
Feb  2 04:41:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:42 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Feb  2 04:41:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:42 2026: (VI_0) Entering BACKUP STATE
Feb  2 04:41:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:42.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:42.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:44.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:41:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:44.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:41:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:46 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:46.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:41:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:46.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:41:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:47 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:48 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:48 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:41:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:48 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:48 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Feb  2 04:41:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:41:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:48.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:41:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:48.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094149 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:41:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:49 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:49 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Feb  2 04:41:49 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:41:49 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:41:49 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:41:49 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:49 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:50 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 55 pg[8.0( v 37'12 (0'0,37'12] local-lis/les=36/37 n=6 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=55 pruub=10.417726517s) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 37'11 mlcod 37'11 active pruub 171.950485229s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:41:50 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 55 pg[8.0( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=55 pruub=10.417726517s) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 37'11 mlcod 0'0 unknown pruub 171.950485229s@ mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:50 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:50 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08001f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:50.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:41:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:41:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:41:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Feb  2 04:41:50 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:51.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:51 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.14( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.15( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.16( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.17( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.10( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.2( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.3( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.f( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.11( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.8( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.9( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.a( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.d( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.c( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.b( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1( v 37'12 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.7( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.6( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.e( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.4( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1b( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.5( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1a( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.19( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.18( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1e( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1d( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1c( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1f( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.13( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.12( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.14( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.16( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.15( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.17( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.10( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.3( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.2( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.11( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.8( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.9( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.a( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.d( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.0( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 37'11 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.7( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.6( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.e( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.4( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.5( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.18( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.19( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1d( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.13( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1e( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1a( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.12( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.14 scrub starts
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.14 scrub ok
Feb  2 04:41:52 np0005604791 ceph-mon[80115]: Regenerating cephadm self-signed grafana TLS certificates
Feb  2 04:41:52 np0005604791 ceph-mon[80115]: Deploying daemon grafana.compute-0 on compute-0
Feb  2 04:41:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:41:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:41:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:41:52 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:41:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:52 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 57 pg[9.0( v 44'1041 (0'0,44'1041] local-lis/les=38/39 n=178 ec=38/38 lis/c=38/38 les/c/f=39/39/0 sis=57 pruub=10.447933197s) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 44'1040 mlcod 44'1040 active pruub 174.220642090s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 57 pg[9.0( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=38/38 lis/c=38/38 les/c/f=39/39/0 sis=57 pruub=10.447933197s) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 44'1040 mlcod 0'0 unknown pruub 174.220642090s@ mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dab2e8 space 0x5616e0cb91f0 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc6848 space 0x5616e0db21b0 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dabc48 space 0x5616e0bd4de0 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc76a8 space 0x5616e0bd4d10 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d8c708 space 0x5616e0db2420 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0b89f68 space 0x5616e0cb9120 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d8d888 space 0x5616e0db2d10 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0df65c8 space 0x5616e0c6d460 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc7b08 space 0x5616e0db20e0 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e077fce8 space 0x5616e0db2aa0 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dabb08 space 0x5616e070b7a0 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d8d4c8 space 0x5616e0db25c0 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d8cc08 space 0x5616e0db24f0 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0daa668 space 0x5616e0c18350 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc6c08 space 0x5616e0db2280 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dab108 space 0x5616e070b870 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d9d108 space 0x5616e0db2830 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc7568 space 0x5616e05c9a10 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0daaca8 space 0x5616e0db3c80 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc7e28 space 0x5616e0bd4eb0 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0db5608 space 0x5616e0c97e20 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d8cca8 space 0x5616e0bda420 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0b08848 space 0x5616e0cb9050 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0db9b08 space 0x5616e0cb9c80 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d9dba8 space 0x5616e0db2c40 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d9cb68 space 0x5616e0db2b70 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc6028 space 0x5616e0bd5600 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d9cac8 space 0x5616e0db2760 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0daa488 space 0x5616e0cb92c0 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d9c5c8 space 0x5616e0db2690 0x0~1000 clean)
Feb  2 04:41:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:52 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:41:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:52.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:41:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:53.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:53 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08001f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Feb  2 04:41:53 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:41:53 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:41:53 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:41:53 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Feb  2 04:41:53 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.15( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.14( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.17( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.16( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.11( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.10( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.3( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.2( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.e( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.9( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.8( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.b( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.f( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.c( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.d( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.a( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.6( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.7( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.5( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.4( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1a( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1b( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.18( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.19( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1e( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1f( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1c( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1d( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.12( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.13( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.14( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.2( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.0( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=38/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 44'1040 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.c( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.5( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.4( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1c( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:53 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:54 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Feb  2 04:41:54 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Feb  2 04:41:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:54 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:54 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Feb  2 04:41:54 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 59 pg[11.0( empty local-lis/les=42/43 n=0 ec=42/42 lis/c=42/42 les/c/f=43/43/0 sis=59 pruub=12.393096924s) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active pruub 178.257980347s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:41:54 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 59 pg[11.0( empty local-lis/les=42/43 n=0 ec=42/42 lis/c=42/42 les/c/f=43/43/0 sis=59 pruub=12.393096924s) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown pruub 178.257980347s@ mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:54 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Feb  2 04:41:54 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:41:54 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb  2 04:41:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:54 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:54.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:55.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:55 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Feb  2 04:41:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:41:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.17( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.16( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.15( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.13( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.12( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.c( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.b( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.a( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.14( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.9( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.d( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.e( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.f( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.8( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.2( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.3( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.4( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.5( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.6( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.7( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.18( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.19( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1a( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1c( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1b( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1d( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1e( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1f( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.11( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.10( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.17( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.16( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.12( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.15( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.13( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.c( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.0( empty local-lis/les=59/60 n=0 ec=42/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.b( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.9( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.d( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.8( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.f( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.3( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.4( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.2( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.5( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.6( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.7( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.18( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.19( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.14( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1c( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1b( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1d( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1f( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.11( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.10( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:41:55 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:41:55 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Feb  2 04:41:56 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Feb  2 04:41:56 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Feb  2 04:41:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:56 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08008dc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:56 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:56 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:41:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:41:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:56.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:41:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:41:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:57.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:41:57 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.10 scrub starts
Feb  2 04:41:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:57 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:57 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.10 scrub ok
Feb  2 04:41:58 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.8 deep-scrub starts
Feb  2 04:41:58 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.8 deep-scrub ok
Feb  2 04:41:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:58 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:58 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08008f40 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:58.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:41:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:41:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:59.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:41:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:59 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:41:59 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Feb  2 04:41:59 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.17( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.609076500s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.828979492s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.17( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.609023094s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.828979492s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.14( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.121712685s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.341796875s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.16( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.613359451s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.833557129s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.14( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.121651649s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.341796875s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.15( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.125307083s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345596313s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.16( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.613282204s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.833557129s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.15( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.125280380s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345596313s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.14( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.614042282s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834625244s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.14( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.614006996s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834625244s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.13( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612858772s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.833755493s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.17( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124693871s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345626831s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.13( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612816811s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.833755493s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.10( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124675751s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345657349s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.17( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124633789s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345626831s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.10( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124631882s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345657349s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.16( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124475479s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345626831s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.16( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124404907s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345626831s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.12( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612273216s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.833572388s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612373352s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.833770752s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.11( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124789238s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346160889s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.12( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612235069s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.833572388s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612346649s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.833770752s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.2( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124505997s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345977783s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.11( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124707222s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346160889s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.2( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124474525s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345977783s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.3( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124224663s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345855713s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124885559s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346649170s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.3( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124187469s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345855713s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124848366s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346649170s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.9( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124327660s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346328735s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611934662s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.833969116s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611896515s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.833969116s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.8( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124028206s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346130371s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.9( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124276161s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346328735s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.a( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124288559s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346420288s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.8( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123966217s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346130371s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.a( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124264717s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346420288s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.d( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123930931s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346435547s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611637115s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834091187s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.f( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611498833s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834075928s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.d( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123887062s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346435547s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123829842s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346450806s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611479759s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834091187s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.f( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611470222s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834075928s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123803139s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346450806s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.8( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611256599s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834030151s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123665810s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346466064s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123608589s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346466064s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.8( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611164093s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834030151s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.3( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611262321s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834335327s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.4( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611124992s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834335327s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.3( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611133575s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834335327s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.4( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611044884s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834335327s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.5( v 60'1 (0'0,60'1] local-lis/les=59/60 n=1 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611040115s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=60'1 lcod 0'0 mlcod 0'0 active pruub 182.834381104s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.6( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123147011s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346542358s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.5( v 60'1 (0'0,60'1] local-lis/les=59/60 n=1 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611004829s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=60'1 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.834381104s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.6( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123107910s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346542358s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.7( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610848427s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834472656s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.7( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610822678s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834472656s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.4( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122960091s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346664429s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.5( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123128891s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346801758s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.4( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122926712s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346664429s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.5( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123043060s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346801758s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.19( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610724449s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834594727s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122791290s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346664429s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.19( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610702515s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834594727s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122735977s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346664429s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610624313s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834655762s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.19( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122812271s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346878052s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610590935s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834655762s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1b( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610517502s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834686279s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.19( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122742653s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346878052s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.18( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122614861s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346893311s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1b( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610429764s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834686279s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1c( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610346794s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834655762s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.18( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122591019s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346893311s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1c( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610308647s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834655762s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124456406s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.348846436s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1d( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610197067s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834686279s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124374390s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.348846436s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1d( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610175133s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834686279s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610151291s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834747314s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610117912s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834747314s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124007225s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.348831177s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123960495s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.348831177s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.12( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124003410s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.349044800s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.12( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123972893s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.349044800s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.15( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.10( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.12( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.14( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.13( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.2( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.6( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.a( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.c( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.8( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.b( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.8( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.e( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.5( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.18( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.19( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.1c( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.19( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.1b( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Feb  2 04:42:00 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:00 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:00 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: Deploying daemon keepalived.rgw.default.compute-2.tapsuz on compute-2
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Feb  2 04:42:00 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:42:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:00.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:01.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:01 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08009860 fd 47 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.2( v 41'48 (0'0,41'48] local-lis/les=61/62 n=1 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.10( v 60'66 lc 53'46 (0'0,60'66] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=60'66 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.15( v 60'57 lc 60'56 (0'0,60'57] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=60'57 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.14( v 60'57 lc 60'56 (0'0,60'57] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=60'57 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.13( v 41'48 (0'0,41'48] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.12( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.6( v 54'63 lc 53'43 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.b( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.8( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.a( v 54'63 lc 0'0 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.c( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.8( v 41'48 (0'0,41'48] local-lis/les=61/62 n=1 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.e( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.5( v 41'48 (0'0,41'48] local-lis/les=61/62 n=1 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.18( v 41'48 (0'0,41'48] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.19( v 41'48 (0'0,41'48] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.1c( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.19( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.1b( v 41'48 (0'0,41'48] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.277121) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321277313, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6845, "num_deletes": 259, "total_data_size": 18219860, "memory_usage": 19128080, "flush_reason": "Manual Compaction"}
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321370080, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11534958, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 6850, "table_properties": {"data_size": 11509030, "index_size": 16342, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8517, "raw_key_size": 83139, "raw_average_key_size": 24, "raw_value_size": 11443688, "raw_average_value_size": 3367, "num_data_blocks": 719, "num_entries": 3398, "num_filter_entries": 3398, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 1770025175, "file_creation_time": 1770025321, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 92992 microseconds, and 26472 cpu microseconds.
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.370174) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11534958 bytes OK
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.370194) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.377756) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.377773) EVENT_LOG_v1 {"time_micros": 1770025321377769, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.377791) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18183110, prev total WAL file size 18183110, number of live WAL files 2.
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.379683) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323535' seq:0, type:0; will stop at (end)
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(11MB) 8(1648B)]
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321379738, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11536606, "oldest_snapshot_seqno": -1}
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3143 keys, 11531437 bytes, temperature: kUnknown
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321546606, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11531437, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11506138, "index_size": 16358, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7877, "raw_key_size": 79606, "raw_average_key_size": 25, "raw_value_size": 11443968, "raw_average_value_size": 3641, "num_data_blocks": 718, "num_entries": 3143, "num_filter_entries": 3143, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770025321, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.547189) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11531437 bytes
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.549844) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 69.0 rd, 69.0 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(11.0, 0.0 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3403, records dropped: 260 output_compression: NoCompression
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.549873) EVENT_LOG_v1 {"time_micros": 1770025321549860, "job": 4, "event": "compaction_finished", "compaction_time_micros": 167196, "compaction_time_cpu_micros": 16046, "output_level": 6, "num_output_files": 1, "total_output_size": 11531437, "num_input_records": 3403, "num_output_records": 3143, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321551445, "job": 4, "event": "table_file_deletion", "file_number": 14}
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321551499, "job": 4, "event": "table_file_deletion", "file_number": 8}
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.379581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:42:01 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Feb  2 04:42:02 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.10 scrub starts
Feb  2 04:42:02 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.10 scrub ok
Feb  2 04:42:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Feb  2 04:42:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:02 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:02 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:02 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:02 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:02 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:02 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:02 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Feb  2 04:42:02 np0005604791 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Feb  2 04:42:02 np0005604791 ceph-mon[80115]: Deploying daemon keepalived.rgw.default.compute-0.pxmjnp on compute-0
Feb  2 04:42:02 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Feb  2 04:42:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:02.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:03.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:03 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.14 scrub starts
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.14 scrub ok
Feb  2 04:42:03 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.606101990s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.751937866s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.605978966s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.751983643s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.605916023s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.751983643s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.605821609s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.751937866s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609919548s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.756454468s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609889030s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.756454468s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609468460s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.756454468s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609417915s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.756454468s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609522820s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.756866455s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609489441s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.756866455s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609068871s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.757019043s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.608972549s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.757019043s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.608766556s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.756958008s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.608687401s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.756958008s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.614995003s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.763641357s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:03 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.614953041s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.763641357s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:03 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Feb  2 04:42:03 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:04 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.6 scrub starts
Feb  2 04:42:04 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.6 scrub ok
Feb  2 04:42:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:04 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08009860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:04 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0002f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:04.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:04 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Feb  2 04:42:04 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.15 scrub starts
Feb  2 04:42:04 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.15 scrub ok
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:05.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:05 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.a deep-scrub starts
Feb  2 04:42:05 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.a deep-scrub ok
Feb  2 04:42:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Feb  2 04:42:06 np0005604791 ceph-mon[80115]: Deploying daemon prometheus.compute-0 on compute-0
Feb  2 04:42:06 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.793918610s) [2] async=[2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.273803711s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.793812752s) [2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.273803711s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.782711983s) [2] async=[2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.263900757s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.782621384s) [2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.263900757s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.782208443s) [2] async=[2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.263931274s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.782091141s) [2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.263931274s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:06 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:06 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e0800a180 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.15 scrub starts
Feb  2 04:42:06 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.15 scrub ok
Feb  2 04:42:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:06.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:07.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:07 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e0800a180 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:07 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Feb  2 04:42:07 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.790109634s) [2] async=[2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.273849487s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.790036201s) [2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.273849487s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.789252281s) [2] async=[2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.273773193s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.789169312s) [2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.273773193s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.788814545s) [2] async=[2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.273757935s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.788699150s) [2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.273757935s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.788599014s) [2] async=[2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.274124146s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.788511276s) [2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.274124146s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.787535667s) [2] async=[2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.273727417s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.787388802s) [2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.273727417s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.0 scrub starts
Feb  2 04:42:07 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.0 scrub ok
Feb  2 04:42:08 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Feb  2 04:42:08 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Feb  2 04:42:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:08 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de8000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:08 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.721615791s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.751953125s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.721550941s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.751953125s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.725348473s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.756683350s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.725308418s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.756683350s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.5( v 59'1044 (0'0,59'1044] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.724955559s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=58'1042 lcod 59'1043 mlcod 59'1043 active pruub 188.756774902s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.5( v 59'1044 (0'0,59'1044] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.724844933s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=58'1042 lcod 59'1043 mlcod 0'0 unknown NOTIFY pruub 188.756774902s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.731335640s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.763656616s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:08 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.731319427s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.763656616s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:08 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.c scrub starts
Feb  2 04:42:08 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.c scrub ok
Feb  2 04:42:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:42:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:08.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:42:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:09.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:09 np0005604791 kernel: ganesha.nfsd[86643]: segfault at 50 ip 00007f9e9266932e sp 00007f9e1e7fb210 error 4 in libntirpc.so.5.8[7f9e9264e000+2c000] likely on CPU 2 (core 0, socket 2)
Feb  2 04:42:09 np0005604791 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb  2 04:42:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:09 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 48 proxy ignored for local
Feb  2 04:42:09 np0005604791 systemd[1]: Started Process Core Dump (PID 86673/UID 0).
Feb  2 04:42:09 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Feb  2 04:42:09 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Feb  2 04:42:09 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.5( v 59'1044 (0'0,59'1044] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=58'1042 lcod 59'1043 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:09 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:09 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.5( v 59'1044 (0'0,59'1044] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=58'1042 lcod 59'1043 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:09 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:09 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:09 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:09 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:09 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:09 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.b scrub starts
Feb  2 04:42:09 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.b scrub ok
Feb  2 04:42:09 np0005604791 systemd-coredump[86674]: Process 86240 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 41:#012#0  0x00007f9e9266932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Feb  2 04:42:09 np0005604791 systemd[1]: systemd-coredump@1-86673-0.service: Deactivated successfully.
Feb  2 04:42:10 np0005604791 podman[86679]: 2026-02-02 09:42:10.020227001 +0000 UTC m=+0.024935834 container died 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Feb  2 04:42:10 np0005604791 systemd[1]: var-lib-containers-storage-overlay-fffc5fa1fa69a3fdffe465386ca294ac1ff31a0b62c5a54f0b0563f32f221eaf-merged.mount: Deactivated successfully.
Feb  2 04:42:10 np0005604791 podman[86679]: 2026-02-02 09:42:10.201033035 +0000 UTC m=+0.205741808 container remove 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Feb  2 04:42:10 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb  2 04:42:10 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 04:42:10 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.105s CPU time.
Feb  2 04:42:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Feb  2 04:42:10 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Feb  2 04:42:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:10.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:11.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.9 scrub starts
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.9 scrub ok
Feb  2 04:42:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.397579193s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 196.752578735s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.397531509s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 196.752578735s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.400834084s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 196.756729126s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.400775909s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 196.756729126s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.399782181s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 196.756805420s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.399748802s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 196.756805420s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.400461197s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 196.757797241s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.400429726s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 196.757797241s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] async=[2] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] async=[2] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.5( v 59'1044 (0'0,59'1044] local-lis/les=70/71 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] async=[2] r=0 lpr=70 pi=[57,70)/1 crt=59'1044 lcod 59'1043 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] async=[2] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:11 np0005604791 ceph-mgr[80422]: mgr handle_mgr_map respawning because set of enabled modules changed!
Feb  2 04:42:11 np0005604791 systemd[1]: session-34.scope: Deactivated successfully.
Feb  2 04:42:11 np0005604791 systemd[1]: session-34.scope: Consumed 18.173s CPU time.
Feb  2 04:42:11 np0005604791 systemd-logind[805]: Session 34 logged out. Waiting for processes to exit.
Feb  2 04:42:11 np0005604791 systemd-logind[805]: Removed session 34.
Feb  2 04:42:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setuser ceph since I am not root
Feb  2 04:42:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setgroup ceph since I am not root
Feb  2 04:42:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=5 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=72 pruub=15.885287285s) [2] async=[2] r=-1 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 198.365600586s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=5 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=72 pruub=15.885204315s) [2] r=-1 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 198.365600586s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:11 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:11 np0005604791 ceph-mgr[80422]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Feb  2 04:42:11 np0005604791 ceph-mgr[80422]: pidfile_write: ignore empty --pid-file
Feb  2 04:42:11 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'alerts'
Feb  2 04:42:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:11.415+0000 7f8e1069c140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb  2 04:42:11 np0005604791 ceph-mgr[80422]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb  2 04:42:11 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'balancer'
Feb  2 04:42:11 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Feb  2 04:42:11 np0005604791 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Feb  2 04:42:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:11.493+0000 7f8e1069c140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb  2 04:42:11 np0005604791 ceph-mgr[80422]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb  2 04:42:11 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'cephadm'
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.d scrub starts
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.d scrub ok
Feb  2 04:42:12 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'crash'
Feb  2 04:42:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:12.252+0000 7f8e1069c140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb  2 04:42:12 np0005604791 ceph-mgr[80422]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb  2 04:42:12 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'dashboard'
Feb  2 04:42:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=4 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.876825333s) [2] async=[2] r=-1 lpr=73 pi=[57,73)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 198.372344971s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=4 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.876741409s) [2] r=-1 lpr=73 pi=[57,73)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 198.372344971s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=6 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.875668526s) [2] async=[2] r=-1 lpr=73 pi=[57,73)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 198.372756958s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=6 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.875595093s) [2] r=-1 lpr=73 pi=[57,73)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 198.372756958s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.5( v 72'1048 (0'0,72'1048] local-lis/les=70/71 n=6 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.874918938s) [2] async=[2] r=-1 lpr=73 pi=[57,73)/1 crt=59'1044 lcod 72'1047 mlcod 72'1047 active pruub 198.372436523s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.5( v 72'1048 (0'0,72'1048] local-lis/les=70/71 n=6 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.874726295s) [2] r=-1 lpr=73 pi=[57,73)/1 crt=59'1044 lcod 72'1047 mlcod 0'0 unknown NOTIFY pruub 198.372436523s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] async=[1] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] async=[1] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] async=[1] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:12 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] async=[1] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=6}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:12 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'devicehealth'
Feb  2 04:42:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:12.900+0000 7f8e1069c140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb  2 04:42:12 np0005604791 ceph-mgr[80422]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb  2 04:42:12 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'diskprediction_local'
Feb  2 04:42:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:12.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb  2 04:42:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb  2 04:42:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]:  from numpy import show_config as show_numpy_config
Feb  2 04:42:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:13.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:13.042+0000 7f8e1069c140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'influx'
Feb  2 04:42:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:13.105+0000 7f8e1069c140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'insights'
Feb  2 04:42:13 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.e scrub starts
Feb  2 04:42:13 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.e scrub ok
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'iostat'
Feb  2 04:42:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:13.244+0000 7f8e1069c140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'k8sevents'
Feb  2 04:42:13 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Feb  2 04:42:13 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.008621216s) [1] async=[1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 199.507293701s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:13 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.008448601s) [1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 199.507293701s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:13 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.033875465s) [1] async=[1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 199.532867432s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:13 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.033797264s) [1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 199.532867432s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:13 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.032817841s) [1] async=[1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 199.532943726s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:13 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.032670975s) [1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 199.532943726s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:13 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.032168388s) [1] async=[1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 199.532821655s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:13 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.032093048s) [1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 199.532821655s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'localpool'
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'mds_autoscaler'
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'mirroring'
Feb  2 04:42:13 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'nfs'
Feb  2 04:42:14 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.2 scrub starts
Feb  2 04:42:14 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.2 scrub ok
Feb  2 04:42:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.168+0000 7f8e1069c140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'orchestrator'
Feb  2 04:42:14 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Feb  2 04:42:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.382+0000 7f8e1069c140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'osd_perf_query'
Feb  2 04:42:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.449+0000 7f8e1069c140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'osd_support'
Feb  2 04:42:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.506+0000 7f8e1069c140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'pg_autoscaler'
Feb  2 04:42:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.577+0000 7f8e1069c140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'progress'
Feb  2 04:42:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.643+0000 7f8e1069c140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'prometheus'
Feb  2 04:42:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:14.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.952+0000 7f8e1069c140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb  2 04:42:14 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rbd_support'
Feb  2 04:42:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:15.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:15.043+0000 7f8e1069c140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb  2 04:42:15 np0005604791 ceph-mgr[80422]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb  2 04:42:15 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'restful'
Feb  2 04:42:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094215 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:42:15 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1 scrub starts
Feb  2 04:42:15 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1 scrub ok
Feb  2 04:42:15 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rgw'
Feb  2 04:42:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:15.425+0000 7f8e1069c140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb  2 04:42:15 np0005604791 ceph-mgr[80422]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb  2 04:42:15 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'rook'
Feb  2 04:42:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:15.912+0000 7f8e1069c140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb  2 04:42:15 np0005604791 ceph-mgr[80422]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb  2 04:42:15 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'selftest'
Feb  2 04:42:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:15.976+0000 7f8e1069c140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb  2 04:42:15 np0005604791 ceph-mgr[80422]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb  2 04:42:15 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'snap_schedule'
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.047+0000 7f8e1069c140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'stats'
Feb  2 04:42:16 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.0 scrub starts
Feb  2 04:42:16 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.0 scrub ok
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'status'
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.175+0000 7f8e1069c140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'telegraf'
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.236+0000 7f8e1069c140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'telemetry'
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.369+0000 7f8e1069c140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'test_orchestrator'
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.561+0000 7f8e1069c140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'volumes'
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.786+0000 7f8e1069c140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Loading python module 'zabbix'
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.845+0000 7f8e1069c140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr load Constructed class from module: dashboard
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: mgr load Constructed class from module: prometheus
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: ms_deliver_dispatch: unhandled message 0x55a32753d860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [dashboard INFO root] Configured CherryPy, starting engine...
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [dashboard INFO root] Starting engine...
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [prometheus INFO root] server_addr: :: server_port: 9283
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [prometheus INFO root] Starting engine...
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: [02/Feb/2026:09:42:16] ENGINE Bus STARTING
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [prometheus INFO cherrypy.error] [02/Feb/2026:09:42:16] ENGINE Bus STARTING
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: CherryPy Checker:
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: The Application mounted at '' has an empty config.
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 
Feb  2 04:42:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:16.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [dashboard INFO root] Engine started...
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: [02/Feb/2026:09:42:16] ENGINE Serving on http://:::9283
Feb  2 04:42:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: [02/Feb/2026:09:42:16] ENGINE Bus STARTED
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [prometheus INFO cherrypy.error] [02/Feb/2026:09:42:16] ENGINE Serving on http://:::9283
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [prometheus INFO cherrypy.error] [02/Feb/2026:09:42:16] ENGINE Bus STARTED
Feb  2 04:42:16 np0005604791 ceph-mgr[80422]: [prometheus INFO root] Engine started.
Feb  2 04:42:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Feb  2 04:42:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:17.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:17 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.7 scrub starts
Feb  2 04:42:17 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.7 scrub ok
Feb  2 04:42:17 np0005604791 ceph-mon[80115]: Active manager daemon compute-0.djvyfo restarted
Feb  2 04:42:17 np0005604791 ceph-mon[80115]: Activating manager daemon compute-0.djvyfo
Feb  2 04:42:17 np0005604791 ceph-mon[80115]: Manager daemon compute-0.djvyfo is now available
Feb  2 04:42:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/mirror_snapshot_schedule"}]: dispatch
Feb  2 04:42:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/trash_purge_schedule"}]: dispatch
Feb  2 04:42:17 np0005604791 systemd-logind[805]: New session 36 of user ceph-admin.
Feb  2 04:42:17 np0005604791 systemd[1]: Started Session 36 of User ceph-admin.
Feb  2 04:42:18 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.6 scrub starts
Feb  2 04:42:18 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.6 scrub ok
Feb  2 04:42:18 np0005604791 podman[86905]: 2026-02-02 09:42:18.337779135 +0000 UTC m=+0.069156590 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Feb  2 04:42:18 np0005604791 podman[86905]: 2026-02-02 09:42:18.418198361 +0000 UTC m=+0.149575806 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Feb  2 04:42:18 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:42:18] ENGINE Bus STARTING
Feb  2 04:42:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:18.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:18 np0005604791 podman[87046]: 2026-02-02 09:42:18.999308324 +0000 UTC m=+0.065465390 container exec 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 04:42:19 np0005604791 podman[87046]: 2026-02-02 09:42:19.010639243 +0000 UTC m=+0.076796249 container exec_died 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 04:42:19 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.18 scrub starts
Feb  2 04:42:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:42:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:19.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:42:19 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.18 scrub ok
Feb  2 04:42:19 np0005604791 podman[87166]: 2026-02-02 09:42:19.37790524 +0000 UTC m=+0.059805471 container exec 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 04:42:19 np0005604791 podman[87166]: 2026-02-02 09:42:19.386779738 +0000 UTC m=+0.068679869 container exec_died 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 04:42:19 np0005604791 podman[87230]: 2026-02-02 09:42:19.582483468 +0000 UTC m=+0.062611200 container exec 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, com.redhat.component=keepalived-container, distribution-scope=public, io.openshift.expose-services=, name=keepalived, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.28.2, vendor=Red Hat, Inc., version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb  2 04:42:19 np0005604791 podman[87230]: 2026-02-02 09:42:19.596718008 +0000 UTC m=+0.076845650 container exec_died 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, release=1793, build-date=2023-02-22T09:23:20, architecture=x86_64, version=2.2.4, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-type=git, description=keepalived for Ceph)
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:42:18] ENGINE Serving on https://192.168.122.100:7150
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:42:18] ENGINE Client ('192.168.122.100', 35430) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:42:18] ENGINE Serving on http://192.168.122.100:8765
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: [02/Feb/2026:09:42:18] ENGINE Bus STARTED
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.775767) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025339775840, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 905, "num_deletes": 251, "total_data_size": 3412189, "memory_usage": 3570384, "flush_reason": "Manual Compaction"}
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025339814911, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 2243757, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6855, "largest_seqno": 7755, "table_properties": {"data_size": 2239090, "index_size": 2123, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 12500, "raw_average_key_size": 21, "raw_value_size": 2228756, "raw_average_value_size": 3803, "num_data_blocks": 92, "num_entries": 586, "num_filter_entries": 586, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025321, "oldest_key_time": 1770025321, "file_creation_time": 1770025339, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 39235 microseconds, and 4221 cpu microseconds.
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.815014) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 2243757 bytes OK
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.815035) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.843625) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.843699) EVENT_LOG_v1 {"time_micros": 1770025339843683, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.843734) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 3407067, prev total WAL file size 3435163, number of live WAL files 2.
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.844632) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(2191KB)], [15(10MB)]
Feb  2 04:42:19 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025339844702, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 13775194, "oldest_snapshot_seqno": -1}
Feb  2 04:42:20 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1a scrub starts
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3197 keys, 12428737 bytes, temperature: kUnknown
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025340033883, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12428737, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12403370, "index_size": 16298, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8005, "raw_key_size": 82723, "raw_average_key_size": 25, "raw_value_size": 12340273, "raw_average_value_size": 3859, "num_data_blocks": 707, "num_entries": 3197, "num_filter_entries": 3197, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770025339, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.034225) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12428737 bytes
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.036157) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 72.8 rd, 65.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 11.0 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(11.7) write-amplify(5.5) OK, records in: 3729, records dropped: 532 output_compression: NoCompression
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.036189) EVENT_LOG_v1 {"time_micros": 1770025340036174, "job": 6, "event": "compaction_finished", "compaction_time_micros": 189277, "compaction_time_cpu_micros": 22185, "output_level": 6, "num_output_files": 1, "total_output_size": 12428737, "num_input_records": 3729, "num_output_records": 3197, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025340036608, "job": 6, "event": "table_file_deletion", "file_number": 17}
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025340037865, "job": 6, "event": "table_file_deletion", "file_number": 15}
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.844532) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.037956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.037968) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.037973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.037978) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.037984) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:42:20 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1a scrub ok
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:20 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 2.
Feb  2 04:42:20 np0005604791 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:42:20 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.105s CPU time.
Feb  2 04:42:20 np0005604791 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Feb  2 04:42:20 np0005604791 podman[87440]: 2026-02-02 09:42:20.79212459 +0000 UTC m=+0.042435084 container create fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:42:20 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b0aa32c59c7b5af7c31cc81a030baa01c0eb043b66d432659d68e87d7177710/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb  2 04:42:20 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b0aa32c59c7b5af7c31cc81a030baa01c0eb043b66d432659d68e87d7177710/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:42:20 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b0aa32c59c7b5af7c31cc81a030baa01c0eb043b66d432659d68e87d7177710/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:42:20 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b0aa32c59c7b5af7c31cc81a030baa01c0eb043b66d432659d68e87d7177710/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:42:20 np0005604791 podman[87440]: 2026-02-02 09:42:20.850487714 +0000 UTC m=+0.100798218 container init fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Feb  2 04:42:20 np0005604791 podman[87440]: 2026-02-02 09:42:20.855534978 +0000 UTC m=+0.105845462 container start fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:42:20 np0005604791 bash[87440]: fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9
Feb  2 04:42:20 np0005604791 podman[87440]: 2026-02-02 09:42:20.775405089 +0000 UTC m=+0.025715623 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:42:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb  2 04:42:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb  2 04:42:20 np0005604791 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:42:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb  2 04:42:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb  2 04:42:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb  2 04:42:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb  2 04:42:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb  2 04:42:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:42:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:20.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:21 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1e scrub starts
Feb  2 04:42:21 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1e scrub ok
Feb  2 04:42:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:21.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:21 np0005604791 systemd-logind[805]: New session 37 of user zuul.
Feb  2 04:42:21 np0005604791 systemd[1]: Started Session 37 of User zuul.
Feb  2 04:42:21 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:21 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:21 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb  2 04:42:21 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:21 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:21 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb  2 04:42:21 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Feb  2 04:42:21 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:21 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:21 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.1f deep-scrub starts
Feb  2 04:42:22 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.1f deep-scrub ok
Feb  2 04:42:22 np0005604791 python3.9[87667]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:42:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Feb  2 04:42:22 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Feb  2 04:42:22 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:22 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:22 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb  2 04:42:22 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:42:22 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1d scrub starts
Feb  2 04:42:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:22.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:22 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1d scrub ok
Feb  2 04:42:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:23.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 78 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=78 pruub=10.051406860s) [2] r=-1 lpr=78 pi=[57,78)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 204.756744385s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 78 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=78 pruub=10.051076889s) [2] r=-1 lpr=78 pi=[57,78)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 204.756744385s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 78 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=78 pruub=10.051866531s) [2] r=-1 lpr=78 pi=[57,78)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 204.757888794s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 78 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=78 pruub=10.051836014s) [2] r=-1 lpr=78 pi=[57,78)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 204.757888794s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:23 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79 pruub=9.832950592s) [2] r=-1 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 204.756683350s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79 pruub=9.832920074s) [2] r=-1 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 204.756683350s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79 pruub=9.833053589s) [2] r=-1 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 204.757949829s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79 pruub=9.833026886s) [2] r=-1 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 204.757949829s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:23 np0005604791 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.conf
Feb  2 04:42:23 np0005604791 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.conf
Feb  2 04:42:23 np0005604791 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.conf
Feb  2 04:42:23 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Feb  2 04:42:23 np0005604791 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:42:23 np0005604791 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:42:23 np0005604791 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.10 scrub starts
Feb  2 04:42:23 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.10 scrub ok
Feb  2 04:42:24 np0005604791 python3.9[88600]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:24 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Feb  2 04:42:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] async=[2] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] async=[2] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:24.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.13 scrub starts
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.13 scrub ok
Feb  2 04:42:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:25.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Feb  2 04:42:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:42:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=6 ec=57/38 lis/c=79/57 les/c/f=80/58/0 sis=81 pruub=15.000723839s) [2] async=[2] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 211.974411011s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=81 pruub=15.783122063s) [1] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 212.757095337s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=6 ec=57/38 lis/c=79/57 les/c/f=80/58/0 sis=81 pruub=15.000452995s) [2] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 211.974411011s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=81 pruub=15.783082008s) [1] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 212.757095337s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=81 pruub=15.783574104s) [1] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 212.758117676s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=81 pruub=15.783516884s) [1] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 212.758117676s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=5 ec=57/38 lis/c=79/57 les/c/f=80/58/0 sis=81 pruub=14.999728203s) [2] async=[2] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 211.974380493s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=5 ec=57/38 lis/c=79/57 les/c/f=80/58/0 sis=81 pruub=14.999654770s) [2] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 211.974380493s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] async=[2] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] async=[2] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:26 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.11 scrub starts
Feb  2 04:42:26 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.11 scrub ok
Feb  2 04:42:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Feb  2 04:42:26 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=6 ec=57/38 lis/c=80/57 les/c/f=81/58/0 sis=82 pruub=15.465485573s) [2] async=[2] r=-1 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 212.981048584s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:26 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=6 ec=57/38 lis/c=80/57 les/c/f=81/58/0 sis=82 pruub=15.465374947s) [2] r=-1 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 212.981048584s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:26 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:26 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:26 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:26 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:26 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=5 ec=57/38 lis/c=80/57 les/c/f=81/58/0 sis=82 pruub=15.467186928s) [2] async=[2] r=-1 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 212.984359741s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:26 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=5 ec=57/38 lis/c=80/57 les/c/f=81/58/0 sis=82 pruub=15.467089653s) [2] r=-1 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 212.984359741s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Feb  2 04:42:26 np0005604791 ceph-mon[80115]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb  2 04:42:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:42:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:26.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:42:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:42:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:42:27 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.2 scrub starts
Feb  2 04:42:27 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.2 scrub ok
Feb  2 04:42:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:27.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:28 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.13 scrub starts
Feb  2 04:42:28 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.13 scrub ok
Feb  2 04:42:28 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Feb  2 04:42:28 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 83 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] async=[1] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=9}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:28 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Feb  2 04:42:28 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 83 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] async=[1] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:28.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:28 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.b scrub starts
Feb  2 04:42:29 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.b scrub ok
Feb  2 04:42:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:29.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:29 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Feb  2 04:42:29 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Feb  2 04:42:29 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 84 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=6 ec=57/38 lis/c=82/57 les/c/f=83/58/0 sis=84 pruub=14.974443436s) [1] async=[1] r=-1 lpr=84 pi=[57,84)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 215.460998535s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:29 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 84 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=6 ec=57/38 lis/c=82/57 les/c/f=83/58/0 sis=84 pruub=14.974328041s) [1] r=-1 lpr=84 pi=[57,84)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 215.460998535s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:29 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 84 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=5 ec=57/38 lis/c=82/57 les/c/f=83/58/0 sis=84 pruub=14.982478142s) [1] async=[1] r=-1 lpr=84 pi=[57,84)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 215.469451904s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:29 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 84 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=5 ec=57/38 lis/c=82/57 les/c/f=83/58/0 sis=84 pruub=14.982404709s) [1] r=-1 lpr=84 pi=[57,84)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 215.469451904s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:29 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.8 scrub starts
Feb  2 04:42:29 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.8 scrub ok
Feb  2 04:42:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Feb  2 04:42:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:30 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.8 scrub starts
Feb  2 04:42:30 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.8 scrub ok
Feb  2 04:42:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:42:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:30.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:42:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb  2 04:42:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:31.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb  2 04:42:31 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:31 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:31 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:31 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.e scrub starts
Feb  2 04:42:31 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.e scrub ok
Feb  2 04:42:32 np0005604791 ceph-mon[80115]: Reconfiguring mon.compute-0 (monmap changed)...
Feb  2 04:42:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Feb  2 04:42:32 np0005604791 ceph-mon[80115]: Reconfiguring daemon mon.compute-0 on compute-0
Feb  2 04:42:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:32 np0005604791 ceph-mon[80115]: Reconfiguring mgr.compute-0.djvyfo (monmap changed)...
Feb  2 04:42:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.djvyfo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Feb  2 04:42:32 np0005604791 ceph-mon[80115]: Reconfiguring daemon mgr.compute-0.djvyfo on compute-0
Feb  2 04:42:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:32 np0005604791 systemd[1]: session-37.scope: Deactivated successfully.
Feb  2 04:42:32 np0005604791 systemd[1]: session-37.scope: Consumed 7.908s CPU time.
Feb  2 04:42:32 np0005604791 systemd-logind[805]: Session 37 logged out. Waiting for processes to exit.
Feb  2 04:42:32 np0005604791 systemd-logind[805]: Removed session 37.
Feb  2 04:42:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:32.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:32 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.c scrub starts
Feb  2 04:42:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:42:32 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.c scrub ok
Feb  2 04:42:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb  2 04:42:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb  2 04:42:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb  2 04:42:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb  2 04:42:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb  2 04:42:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb  2 04:42:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:42:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:42:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:42:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:33.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:33 np0005604791 ceph-mon[80115]: Reconfiguring crash.compute-0 (monmap changed)...
Feb  2 04:42:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Feb  2 04:42:33 np0005604791 ceph-mon[80115]: Reconfiguring daemon crash.compute-0 on compute-0
Feb  2 04:42:33 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.12 scrub starts
Feb  2 04:42:33 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.12 scrub ok
Feb  2 04:42:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:34 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:34 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:34 np0005604791 ceph-mon[80115]: Reconfiguring osd.1 (monmap changed)...
Feb  2 04:42:34 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Feb  2 04:42:34 np0005604791 ceph-mon[80115]: Reconfiguring daemon osd.1 on compute-0
Feb  2 04:42:34 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:34 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:34 np0005604791 ceph-mon[80115]: Reconfiguring node-exporter.compute-0 (unknown last config time)...
Feb  2 04:42:34 np0005604791 ceph-mon[80115]: Reconfiguring daemon node-exporter.compute-0 on compute-0
Feb  2 04:42:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:34.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:34 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.18 scrub starts
Feb  2 04:42:34 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.18 scrub ok
Feb  2 04:42:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094235 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:42:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:35 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:35.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Feb  2 04:42:35 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Feb  2 04:42:35 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:35 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:35 np0005604791 ceph-mon[80115]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Feb  2 04:42:35 np0005604791 ceph-mon[80115]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Feb  2 04:42:36 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.5 scrub starts
Feb  2 04:42:36 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.5 scrub ok
Feb  2 04:42:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:36 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Feb  2 04:42:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4002070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:36.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:37 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.19 scrub starts
Feb  2 04:42:37 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.19 scrub ok
Feb  2 04:42:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:37 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:37.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Feb  2 04:42:37 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Feb  2 04:42:37 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:37 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:37 np0005604791 ceph-mon[80115]: Reconfiguring grafana.compute-0 (dependencies changed)...
Feb  2 04:42:37 np0005604791 ceph-mon[80115]: Reconfiguring daemon grafana.compute-0 on compute-0
Feb  2 04:42:38 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.19 scrub starts
Feb  2 04:42:38 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.19 scrub ok
Feb  2 04:42:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:38 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Feb  2 04:42:38 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Feb  2 04:42:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:38.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:39 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4002070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:39.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:39 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.1c scrub starts
Feb  2 04:42:39 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.1c scrub ok
Feb  2 04:42:39 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Feb  2 04:42:39 np0005604791 podman[88972]: 2026-02-02 09:42:39.596974171 +0000 UTC m=+0.052074361 container create c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Feb  2 04:42:39 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:39 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:39 np0005604791 ceph-mon[80115]: Reconfiguring crash.compute-1 (monmap changed)...
Feb  2 04:42:39 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Feb  2 04:42:39 np0005604791 ceph-mon[80115]: Reconfiguring daemon crash.compute-1 on compute-1
Feb  2 04:42:39 np0005604791 systemd[1]: Started libpod-conmon-c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067.scope.
Feb  2 04:42:39 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:42:39 np0005604791 podman[88972]: 2026-02-02 09:42:39.570855149 +0000 UTC m=+0.025955329 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:42:39 np0005604791 podman[88972]: 2026-02-02 09:42:39.675839679 +0000 UTC m=+0.130939869 container init c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True)
Feb  2 04:42:39 np0005604791 podman[88972]: 2026-02-02 09:42:39.683824896 +0000 UTC m=+0.138925076 container start c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:42:39 np0005604791 podman[88972]: 2026-02-02 09:42:39.687582418 +0000 UTC m=+0.142682658 container attach c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:42:39 np0005604791 stupefied_edison[88988]: 167 167
Feb  2 04:42:39 np0005604791 systemd[1]: libpod-c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067.scope: Deactivated successfully.
Feb  2 04:42:39 np0005604791 podman[88972]: 2026-02-02 09:42:39.690888939 +0000 UTC m=+0.145989129 container died c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 04:42:39 np0005604791 systemd[1]: var-lib-containers-storage-overlay-d9f3e90accea7336dd75055b02d47aad36752e28d75b34bcc4df3068854b64c9-merged.mount: Deactivated successfully.
Feb  2 04:42:39 np0005604791 podman[88972]: 2026-02-02 09:42:39.735734742 +0000 UTC m=+0.190834922 container remove c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:42:39 np0005604791 systemd[1]: libpod-conmon-c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067.scope: Deactivated successfully.
Feb  2 04:42:40 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.14 scrub starts
Feb  2 04:42:40 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.14 scrub ok
Feb  2 04:42:40 np0005604791 podman[89069]: 2026-02-02 09:42:40.255776273 +0000 UTC m=+0.041044550 container create d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:42:40 np0005604791 systemd[1]: Started libpod-conmon-d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d.scope.
Feb  2 04:42:40 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:42:40 np0005604791 podman[89069]: 2026-02-02 09:42:40.238415206 +0000 UTC m=+0.023683463 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:42:40 np0005604791 podman[89069]: 2026-02-02 09:42:40.343447297 +0000 UTC m=+0.128715614 container init d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:42:40 np0005604791 podman[89069]: 2026-02-02 09:42:40.349774573 +0000 UTC m=+0.135042850 container start d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:42:40 np0005604791 distracted_elbakyan[89086]: 167 167
Feb  2 04:42:40 np0005604791 systemd[1]: libpod-d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d.scope: Deactivated successfully.
Feb  2 04:42:40 np0005604791 podman[89069]: 2026-02-02 09:42:40.354184651 +0000 UTC m=+0.139453048 container attach d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Feb  2 04:42:40 np0005604791 conmon[89086]: conmon d2b237db8ba932eeec5f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d.scope/container/memory.events
Feb  2 04:42:40 np0005604791 podman[89069]: 2026-02-02 09:42:40.355488853 +0000 UTC m=+0.140757140 container died d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:42:40 np0005604791 systemd[1]: var-lib-containers-storage-overlay-38f9f7c4fd5ad40dee5edcb9f87756e146a0d27526016fe6503dd6c79997ea4b-merged.mount: Deactivated successfully.
Feb  2 04:42:40 np0005604791 podman[89069]: 2026-02-02 09:42:40.448868419 +0000 UTC m=+0.234136706 container remove d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:42:40 np0005604791 systemd[1]: libpod-conmon-d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d.scope: Deactivated successfully.
Feb  2 04:42:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:40 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:40 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:40 np0005604791 ceph-mon[80115]: Reconfiguring osd.0 (monmap changed)...
Feb  2 04:42:40 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Feb  2 04:42:40 np0005604791 ceph-mon[80115]: Reconfiguring daemon osd.0 on compute-1
Feb  2 04:42:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Feb  2 04:42:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:40.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:41 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:42:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:41.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:42:41 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.11 scrub starts
Feb  2 04:42:41 np0005604791 podman[89178]: 2026-02-02 09:42:41.117435731 +0000 UTC m=+0.060619861 container create daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default)
Feb  2 04:42:41 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.11 scrub ok
Feb  2 04:42:41 np0005604791 systemd[1]: Started libpod-conmon-daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e.scope.
Feb  2 04:42:41 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:42:41 np0005604791 podman[89178]: 2026-02-02 09:42:41.090502019 +0000 UTC m=+0.033686229 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:42:41 np0005604791 podman[89178]: 2026-02-02 09:42:41.197763426 +0000 UTC m=+0.140947586 container init daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:42:41 np0005604791 podman[89178]: 2026-02-02 09:42:41.204422429 +0000 UTC m=+0.147606559 container start daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Feb  2 04:42:41 np0005604791 silly_lewin[89194]: 167 167
Feb  2 04:42:41 np0005604791 podman[89178]: 2026-02-02 09:42:41.208179122 +0000 UTC m=+0.151363282 container attach daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Feb  2 04:42:41 np0005604791 systemd[1]: libpod-daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e.scope: Deactivated successfully.
Feb  2 04:42:41 np0005604791 podman[89178]: 2026-02-02 09:42:41.209100894 +0000 UTC m=+0.152285034 container died daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Feb  2 04:42:41 np0005604791 systemd[1]: var-lib-containers-storage-overlay-2b7a7e70d2c6d0e533dfe8ca66f3460c2dd5eeab061c3c6b27a85d610fadc521-merged.mount: Deactivated successfully.
Feb  2 04:42:41 np0005604791 podman[89178]: 2026-02-02 09:42:41.244467004 +0000 UTC m=+0.187651164 container remove daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:42:41 np0005604791 systemd[1]: libpod-conmon-daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e.scope: Deactivated successfully.
Feb  2 04:42:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Feb  2 04:42:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:41 np0005604791 ceph-mon[80115]: Reconfiguring mon.compute-1 (monmap changed)...
Feb  2 04:42:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Feb  2 04:42:41 np0005604791 ceph-mon[80115]: Reconfiguring daemon mon.compute-1 on compute-1
Feb  2 04:42:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Feb  2 04:42:42 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Feb  2 04:42:42 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Feb  2 04:42:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4002070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:42 np0005604791 ceph-mon[80115]: Reconfiguring mon.compute-2 (monmap changed)...
Feb  2 04:42:42 np0005604791 ceph-mon[80115]: Reconfiguring daemon mon.compute-2 on compute-2
Feb  2 04:42:42 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:42 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:42 np0005604791 ceph-mon[80115]: Reconfiguring mgr.compute-2.gzlyac (monmap changed)...
Feb  2 04:42:42 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gzlyac", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Feb  2 04:42:42 np0005604791 ceph-mon[80115]: Reconfiguring daemon mgr.compute-2.gzlyac on compute-2
Feb  2 04:42:42 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:42 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:42.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:43 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:43.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:43 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.2 scrub starts
Feb  2 04:42:43 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.2 scrub ok
Feb  2 04:42:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094243 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:42:43 np0005604791 ceph-mon[80115]: Reconfiguring haproxy.rgw.default.compute-2.txhwfs (unknown last config time)...
Feb  2 04:42:43 np0005604791 ceph-mon[80115]: Reconfiguring daemon haproxy.rgw.default.compute-2.txhwfs on compute-2
Feb  2 04:42:44 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.c scrub starts
Feb  2 04:42:44 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.c scrub ok
Feb  2 04:42:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:44.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4002070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Feb  2 04:42:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:45 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:42:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:45.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:42:45 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.0 deep-scrub starts
Feb  2 04:42:45 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.0 deep-scrub ok
Feb  2 04:42:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Feb  2 04:42:45 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Feb  2 04:42:46 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.1 scrub starts
Feb  2 04:42:46 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.1 scrub ok
Feb  2 04:42:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:42:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:46.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:42:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:46 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Feb  2 04:42:46 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:46 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:47 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:47.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:47 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.4 scrub starts
Feb  2 04:42:47 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.4 scrub ok
Feb  2 04:42:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Feb  2 04:42:47 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Feb  2 04:42:48 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.1c scrub starts
Feb  2 04:42:48 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.1c scrub ok
Feb  2 04:42:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:48.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Feb  2 04:42:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:42:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Feb  2 04:42:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Feb  2 04:42:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:42:48 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Feb  2 04:42:48 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 94 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=2 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=94 pruub=8.670993805s) [1] r=-1 lpr=94 pi=[57,94)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 228.754318237s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:48 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 94 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=2 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=94 pruub=8.670732498s) [1] r=-1 lpr=94 pi=[57,94)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 228.754318237s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:48 np0005604791 systemd-logind[805]: New session 38 of user zuul.
Feb  2 04:42:48 np0005604791 systemd[1]: Started Session 38 of User zuul.
Feb  2 04:42:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:49 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:42:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:49.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:42:49 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.12 scrub starts
Feb  2 04:42:49 np0005604791 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.12 scrub ok
Feb  2 04:42:49 np0005604791 python3.9[89369]: ansible-ansible.legacy.ping Invoked with data=pong
Feb  2 04:42:49 np0005604791 ceph-mon[80115]: Health check update: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb  2 04:42:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Feb  2 04:42:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Feb  2 04:42:49 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Feb  2 04:42:49 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 95 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=2 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=95) [1]/[0] r=0 lpr=95 pi=[57,95)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:49 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 95 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=2 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=95) [1]/[0] r=0 lpr=95 pi=[57,95)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:50.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Feb  2 04:42:50 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 96 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=95/96 n=2 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=95) [1]/[0] async=[1] r=0 lpr=95 pi=[57,95)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:50 np0005604791 python3.9[89544]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:42:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:51 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:51.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Feb  2 04:42:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 97 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=95/96 n=2 ec=57/38 lis/c=95/57 les/c/f=96/58/0 sis=97 pruub=15.609633446s) [1] async=[1] r=-1 lpr=97 pi=[57,97)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 238.121505737s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:51 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 97 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=95/96 n=2 ec=57/38 lis/c=95/57 les/c/f=96/58/0 sis=97 pruub=15.609547615s) [1] r=-1 lpr=97 pi=[57,97)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 238.121505737s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:51 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:42:52 np0005604791 python3.9[89725]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:42:52 np0005604791 systemd[83549]: Starting Mark boot as successful...
Feb  2 04:42:52 np0005604791 systemd[83549]: Finished Mark boot as successful.
Feb  2 04:42:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Feb  2 04:42:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:42:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:52.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:42:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:52 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:53 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:53.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:53 np0005604791 python3.9[89898]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:42:54 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Feb  2 04:42:54 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:42:54 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Feb  2 04:42:54 np0005604791 python3.9[90059]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:42:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:42:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:54.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:42:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:54 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 99 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=99 pruub=10.741518021s) [1] r=-1 lpr=99 pi=[57,99)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 236.754623413s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:54 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 99 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=99 pruub=10.741458893s) [1] r=-1 lpr=99 pi=[57,99)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 236.754623413s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:42:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:42:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:55 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:42:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:55.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:42:55 np0005604791 python3.9[90212]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:42:55 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Feb  2 04:42:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Feb  2 04:42:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 100 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=100) [1]/[0] r=0 lpr=100 pi=[57,100)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:55 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 100 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=100) [1]/[0] r=0 lpr=100 pi=[57,100)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:42:55 np0005604791 python3.9[90362]: ansible-ansible.builtin.service_facts Invoked
Feb  2 04:42:55 np0005604791 network[90379]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb  2 04:42:55 np0005604791 network[90380]: 'network-scripts' will be removed from distribution in near future.
Feb  2 04:42:55 np0005604791 network[90381]: It is advised to switch to 'NetworkManager' instead for network management.
Feb  2 04:42:56 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Feb  2 04:42:56 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 101 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=101 pruub=9.382583618s) [1] r=-1 lpr=101 pi=[57,101)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 236.764617920s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:56 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 101 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=101 pruub=9.382432938s) [1] r=-1 lpr=101 pi=[57,101)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 236.764617920s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Feb  2 04:42:56 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 101 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=100/101 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=100) [1]/[0] async=[1] r=0 lpr=100 pi=[57,100)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:56 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Feb  2 04:42:56 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 102 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=100/101 n=5 ec=57/38 lis/c=100/57 les/c/f=101/58/0 sis=102 pruub=15.896315575s) [1] async=[1] r=-1 lpr=102 pi=[57,102)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 243.404632568s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:56 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 102 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=100/101 n=5 ec=57/38 lis/c=100/57 les/c/f=101/58/0 sis=102 pruub=15.896131516s) [1] r=-1 lpr=102 pi=[57,102)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 243.404632568s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:56 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 102 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=102) [1]/[0] r=0 lpr=102 pi=[57,102)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:56 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 102 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=102) [1]/[0] r=0 lpr=102 pi=[57,102)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb  2 04:42:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:56.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:57 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:57.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:57 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Feb  2 04:42:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Feb  2 04:42:57 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 103 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=102/103 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=102) [1]/[0] async=[1] r=0 lpr=102 pi=[57,102)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:42:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:57 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:42:58 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Feb  2 04:42:58 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Feb  2 04:42:58 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 104 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=102/103 n=4 ec=57/38 lis/c=102/57 les/c/f=103/58/0 sis=104 pruub=14.986758232s) [1] async=[1] r=-1 lpr=104 pi=[57,104)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 244.532653809s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:42:58 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 104 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=102/103 n=4 ec=57/38 lis/c=102/57 les/c/f=103/58/0 sis=104 pruub=14.986701012s) [1] r=-1 lpr=104 pi=[57,104)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 244.532653809s@ mbc={}] state<Start>: transitioning to Stray
Feb  2 04:42:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:58.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:59 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:42:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:42:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:42:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:59.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:42:59 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Feb  2 04:42:59 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Feb  2 04:43:00 np0005604791 python3.9[90643]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:43:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Feb  2 04:43:00 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Feb  2 04:43:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:00.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:43:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:00 np0005604791 python3.9[90794]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:43:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:01 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:01.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:01 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Feb  2 04:43:02 np0005604791 python3.9[90948]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:43:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Feb  2 04:43:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Feb  2 04:43:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:43:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:02.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:43:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:03 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:03.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:03 np0005604791 python3.9[91107]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:43:03 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Feb  2 04:43:03 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Feb  2 04:43:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094303 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:43:04 np0005604791 python3.9[91191]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:43:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:04.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:04 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Feb  2 04:43:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:05 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:05.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:43:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Feb  2 04:43:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:06.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Feb  2 04:43:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:07 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:07.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:08.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:08 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:08 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:09 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:09.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:09 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Feb  2 04:43:09 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Feb  2 04:43:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:43:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:10.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:10 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:10 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Feb  2 04:43:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Feb  2 04:43:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:11 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:11.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Feb  2 04:43:12 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Feb  2 04:43:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:12.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:12 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:13.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:13 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:13 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80016a0 fd 50 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:13 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Feb  2 04:43:13 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Feb  2 04:43:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:14.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:14 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:14 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Feb  2 04:43:14 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Feb  2 04:43:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:14 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:15 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:15.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:43:15 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Feb  2 04:43:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:16.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:16 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:16 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Feb  2 04:43:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Feb  2 04:43:16 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 117 pg[9.19( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=82/82 les/c/f=83/83/0 sis=117) [0] r=0 lpr=117 pi=[82,117)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:43:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:16 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:17 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:17.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Feb  2 04:43:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Feb  2 04:43:17 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 118 pg[9.19( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=82/82 les/c/f=83/83/0 sis=118) [0]/[2] r=-1 lpr=118 pi=[82,118)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:43:17 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 118 pg[9.19( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=82/82 les/c/f=83/83/0 sis=118) [0]/[2] r=-1 lpr=118 pi=[82,118)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:43:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:18.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:18 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:18 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:19 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:19 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Feb  2 04:43:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:43:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:19.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:43:19 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Feb  2 04:43:19 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 119 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=119) [0] r=0 lpr=119 pi=[84,119)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:43:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Feb  2 04:43:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Feb  2 04:43:20 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:43:20 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:43:20 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:43:20 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:43:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb  2 04:43:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:20.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:21 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:21.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Feb  2 04:43:21 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:43:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Feb  2 04:43:21 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:43:21 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:43:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:22.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:22 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Feb  2 04:43:22 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:43:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:22 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:23 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:23.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:24 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Feb  2 04:43:24 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Feb  2 04:43:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:43:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:43:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:24.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:43:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:24 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:24 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:25 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:25.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Feb  2 04:43:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Feb  2 04:43:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:43:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:43:25 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:43:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Feb  2 04:43:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:26.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Feb  2 04:43:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:27 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:27.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Feb  2 04:43:27 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:43:27 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:43:27 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Feb  2 04:43:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:28.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:28 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:28 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Feb  2 04:43:28 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Feb  2 04:43:28 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:43:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:28 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:29 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:29.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:29 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Feb  2 04:43:29 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Feb  2 04:43:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:43:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:30.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:30 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Feb  2 04:43:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:30 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:31 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:43:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:31.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:43:31 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Feb  2 04:43:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Feb  2 04:43:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:32.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:43:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:33.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:43:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094333 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:43:33 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Feb  2 04:43:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Feb  2 04:43:33 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:43:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:34.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a310 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:34 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Feb  2 04:43:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Feb  2 04:43:35 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:43:35 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:43:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:35 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:35.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:43:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Feb  2 04:43:36 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:43:36 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb  2 04:43:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Feb  2 04:43:36 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:43:36 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 04:43:36 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:43:36 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:43:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:36.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a330 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:37 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Feb  2 04:43:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:37 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:37.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Feb  2 04:43:37 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:43:38 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Feb  2 04:43:38 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 04:43:38 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 04:43:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:38.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:39 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a350 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:43:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:39.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:43:39 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Feb  2 04:43:39 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 04:43:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:43:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:40.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a350 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:41 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:41.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:43:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:42.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a350 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a350 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:43 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:43.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:44.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:45 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:45.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:45 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:43:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:45 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:43:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:43:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:46.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:47 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:47.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:43:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:48.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:49 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:49.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:43:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:50.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:51 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:51.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:51 np0005604791 python3.9[91580]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:43:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:43:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:52.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:43:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:53 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:53.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:53 np0005604791 python3.9[91880]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb  2 04:43:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094353 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:43:54 np0005604791 python3.9[92112]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb  2 04:43:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:54.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:55 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:55.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:55 np0005604791 python3.9[92265]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:43:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:43:56 np0005604791 python3.9[92417]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb  2 04:43:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:56.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:57 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:57 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:43:57 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:43:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:57.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:57 np0005604791 python3.9[92570]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:43:58 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:43:58 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:43:58 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:43:58 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:43:58 np0005604791 python3.9[92722]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:43:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:58.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:43:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:58 np0005604791 python3.9[92800]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:43:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:59 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:43:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:43:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:43:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:59.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:00 np0005604791 python3.9[92953]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:44:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:00.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:01 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:01.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:01 np0005604791 python3.9[93108]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb  2 04:44:02 np0005604791 python3.9[93261]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb  2 04:44:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:02.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:44:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:44:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:03 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:03.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:03 np0005604791 python3.9[93440]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb  2 04:44:04 np0005604791 python3.9[93592]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb  2 04:44:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:04.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:05 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:05.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:05 np0005604791 python3.9[93745]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:44:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:06.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:07 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:07.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:07 np0005604791 python3.9[93899]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:44:08 np0005604791 python3.9[94051]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:44:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:08 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:08.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:08 np0005604791 python3.9[94129]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:44:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:08 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:09 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:09.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:09 np0005604791 python3.9[94284]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:44:09 np0005604791 python3.9[94362]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:44:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:10 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:10 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:10.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:11 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:11 np0005604791 python3.9[94515]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:44:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:11.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:12 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:12 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:12.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.018688) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453018734, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2994, "num_deletes": 251, "total_data_size": 8909660, "memory_usage": 9050384, "flush_reason": "Manual Compaction"}
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453052326, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 5479434, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7760, "largest_seqno": 10749, "table_properties": {"data_size": 5466280, "index_size": 8496, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3589, "raw_key_size": 30984, "raw_average_key_size": 21, "raw_value_size": 5438333, "raw_average_value_size": 3854, "num_data_blocks": 372, "num_entries": 1411, "num_filter_entries": 1411, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025339, "oldest_key_time": 1770025339, "file_creation_time": 1770025453, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 33689 microseconds, and 6790 cpu microseconds.
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.052379) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 5479434 bytes OK
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.052401) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.058432) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.058457) EVENT_LOG_v1 {"time_micros": 1770025453058450, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.058478) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8895363, prev total WAL file size 8895363, number of live WAL files 2.
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.059997) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(5351KB)], [18(11MB)]
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453060054, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17908171, "oldest_snapshot_seqno": -1}
Feb  2 04:44:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:13 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4077 keys, 14270142 bytes, temperature: kUnknown
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453147093, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14270142, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14237502, "index_size": 21330, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10245, "raw_key_size": 104104, "raw_average_key_size": 25, "raw_value_size": 14157384, "raw_average_value_size": 3472, "num_data_blocks": 916, "num_entries": 4077, "num_filter_entries": 4077, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770025453, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.147412) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14270142 bytes
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.148704) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 205.5 rd, 163.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.2, 11.9 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 4608, records dropped: 531 output_compression: NoCompression
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.148734) EVENT_LOG_v1 {"time_micros": 1770025453148720, "job": 8, "event": "compaction_finished", "compaction_time_micros": 87159, "compaction_time_cpu_micros": 19720, "output_level": 6, "num_output_files": 1, "total_output_size": 14270142, "num_input_records": 4608, "num_output_records": 4077, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453149812, "job": 8, "event": "table_file_deletion", "file_number": 20}
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453151575, "job": 8, "event": "table_file_deletion", "file_number": 18}
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.059912) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.151658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.151666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.151670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.151673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:44:13 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.151676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:44:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:44:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:13.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:44:13 np0005604791 python3.9[94692]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:44:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:14 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:14 np0005604791 python3.9[94844]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb  2 04:44:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:14 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003e30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:14.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:15 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:15.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:15 np0005604791 python3.9[94995]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:44:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:16 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:16 np0005604791 python3.9[95147]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:44:16 np0005604791 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb  2 04:44:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:16 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:16 np0005604791 systemd[1]: tuned.service: Deactivated successfully.
Feb  2 04:44:16 np0005604791 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb  2 04:44:16 np0005604791 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb  2 04:44:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:16.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:17 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:17 np0005604791 systemd[1]: Started Dynamic System Tuning Daemon.
Feb  2 04:44:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:17.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:17 np0005604791 python3.9[95312]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb  2 04:44:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:18 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:18 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:18.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:19 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:19.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:20.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:21 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:21.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:21 np0005604791 python3.9[95466]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:44:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094421 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:44:22 np0005604791 python3.9[95620]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:44:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:22 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:22 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:22.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:23 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80029b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:23.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:23 np0005604791 systemd[1]: session-38.scope: Deactivated successfully.
Feb  2 04:44:23 np0005604791 systemd[1]: session-38.scope: Consumed 59.787s CPU time.
Feb  2 04:44:23 np0005604791 systemd-logind[805]: Session 38 logged out. Waiting for processes to exit.
Feb  2 04:44:23 np0005604791 systemd-logind[805]: Removed session 38.
Feb  2 04:44:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:24 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:24 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:24.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:25 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:25.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80029b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:26.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:27 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:27.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:28 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:28 np0005604791 systemd-logind[805]: New session 39 of user zuul.
Feb  2 04:44:28 np0005604791 systemd[1]: Started Session 39 of User zuul.
Feb  2 04:44:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:28 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:28.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:29 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:29.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:29 np0005604791 python3.9[95804]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:44:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:29 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:44:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:30 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:30 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:44:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:30.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:44:31 np0005604791 python3.9[95961]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb  2 04:44:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:31 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:31.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:32 np0005604791 python3.9[96139]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:44:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:32 np0005604791 python3.9[96223]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb  2 04:44:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:44:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:44:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:32.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003cd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:33.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:34.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:35 np0005604791 python3.9[96380]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:44:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:35 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000054s ======
Feb  2 04:44:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:35.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Feb  2 04:44:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:35 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:44:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:36.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:37 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:37.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:37 np0005604791 python3.9[96536]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb  2 04:44:38 np0005604791 python3.9[96691]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:44:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003d10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:39.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:39 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:39.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:39 np0005604791 python3.9[96846]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb  2 04:44:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:40 np0005604791 python3.9[96998]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:44:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:41.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:41 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003d30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:41.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:41 np0005604791 python3.9[97157]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:44:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094441 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:44:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:43.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:43 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:43.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:43 np0005604791 python3.9[97315]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:44:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:45.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:45 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:45.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:45 np0005604791 python3.9[97605]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb  2 04:44:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:46 np0005604791 python3.9[97757]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:44:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:44:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:47.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:44:47 np0005604791 python3.9[97912]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:44:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:47 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:47.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:49.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:49 np0005604791 python3.9[98069]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:44:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:49 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:49.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:51.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:51 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:51 np0005604791 python3.9[98225]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:44:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:51.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:52 np0005604791 python3.9[98406]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Feb  2 04:44:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:53.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:53 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:53.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:53 np0005604791 systemd[1]: session-39.scope: Deactivated successfully.
Feb  2 04:44:53 np0005604791 systemd[1]: session-39.scope: Consumed 16.376s CPU time.
Feb  2 04:44:53 np0005604791 systemd-logind[805]: Session 39 logged out. Waiting for processes to exit.
Feb  2 04:44:53 np0005604791 systemd-logind[805]: Removed session 39.
Feb  2 04:44:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb  2 04:44:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:55.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb  2 04:44:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:55 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:55.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:44:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:57.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:57 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:57.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:58 np0005604791 systemd-logind[805]: New session 40 of user zuul.
Feb  2 04:44:58 np0005604791 systemd[1]: Started Session 40 of User zuul.
Feb  2 04:44:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:44:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:59.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:44:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:59 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:44:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:44:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb  2 04:44:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:59.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb  2 04:44:59 np0005604791 python3.9[98599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:45:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:01.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:01 np0005604791 python3.9[98754]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:45:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:01 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:01.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094501 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:45:02 np0005604791 python3.9[98949]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:45:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:02 np0005604791 systemd[1]: session-40.scope: Deactivated successfully.
Feb  2 04:45:02 np0005604791 systemd[1]: session-40.scope: Consumed 2.315s CPU time.
Feb  2 04:45:02 np0005604791 systemd-logind[805]: Session 40 logged out. Waiting for processes to exit.
Feb  2 04:45:02 np0005604791 systemd-logind[805]: Removed session 40.
Feb  2 04:45:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:03.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:03 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:03.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:05.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:05 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:05.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c00042d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:07.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:07 np0005604791 kernel: ganesha.nfsd[97992]: segfault at 50 ip 00007fd36de9d32e sp 00007fd2d67fb210 error 4 in libntirpc.so.5.8[7fd36de82000+2c000] likely on CPU 1 (core 0, socket 1)
Feb  2 04:45:07 np0005604791 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb  2 04:45:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:07 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy ignored for local
Feb  2 04:45:07 np0005604791 systemd[1]: Started Process Core Dump (PID 99069/UID 0).
Feb  2 04:45:07 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:45:07 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:45:07 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:45:07 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:45:07 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:45:07 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:45:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb  2 04:45:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:07.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb  2 04:45:08 np0005604791 systemd-coredump[99070]: Process 87470 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 66:#012#0  0x00007fd36de9d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007fd36dea7900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Feb  2 04:45:08 np0005604791 systemd[1]: systemd-coredump@2-99069-0.service: Deactivated successfully.
Feb  2 04:45:08 np0005604791 podman[99075]: 2026-02-02 09:45:08.098645896 +0000 UTC m=+0.034704793 container died fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Feb  2 04:45:08 np0005604791 systemd[1]: var-lib-containers-storage-overlay-5b0aa32c59c7b5af7c31cc81a030baa01c0eb043b66d432659d68e87d7177710-merged.mount: Deactivated successfully.
Feb  2 04:45:08 np0005604791 podman[99075]: 2026-02-02 09:45:08.193466296 +0000 UTC m=+0.129525233 container remove fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:45:08 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb  2 04:45:08 np0005604791 ceph-mon[80115]: Health check update: 3 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb  2 04:45:08 np0005604791 systemd-logind[805]: New session 41 of user zuul.
Feb  2 04:45:08 np0005604791 systemd[1]: Started Session 41 of User zuul.
Feb  2 04:45:08 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 04:45:08 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.348s CPU time.
Feb  2 04:45:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094508 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:45:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:09.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:09.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:09 np0005604791 python3.9[99277]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:45:10 np0005604791 python3.9[99433]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:45:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:11.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:11.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:11 np0005604791 python3.9[99592]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:45:12 np0005604791 python3.9[99701]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:45:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb  2 04:45:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:13.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb  2 04:45:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094513 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:45:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [NOTICE] 032/094513 (4) : haproxy version is 2.3.17-d1c9119
Feb  2 04:45:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [NOTICE] 032/094513 (4) : path to executable is /usr/local/sbin/haproxy
Feb  2 04:45:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [ALERT] 032/094513 (4) : backend 'backend' has no server available!
Feb  2 04:45:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:13.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:14 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:45:14 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:45:14 np0005604791 python3.9[99882]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:45:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:15.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:15.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:15 np0005604791 systemd[1]: session-19.scope: Deactivated successfully.
Feb  2 04:45:15 np0005604791 systemd[1]: session-19.scope: Consumed 7.909s CPU time.
Feb  2 04:45:15 np0005604791 systemd-logind[805]: Session 19 logged out. Waiting for processes to exit.
Feb  2 04:45:15 np0005604791 systemd-logind[805]: Removed session 19.
Feb  2 04:45:16 np0005604791 python3.9[100080]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:45:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:17.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:17 np0005604791 python3.9[100235]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:45:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:17.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:18 np0005604791 python3.9[100400]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:45:18 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 3.
Feb  2 04:45:18 np0005604791 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:45:18 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.348s CPU time.
Feb  2 04:45:18 np0005604791 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:45:18 np0005604791 podman[100531]: 2026-02-02 09:45:18.571943222 +0000 UTC m=+0.047279958 container create 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Feb  2 04:45:18 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c29b5401ef58f083f85a8e200edfe4e6670b4a6d9a10c29e8c2df49cb38fe6d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb  2 04:45:18 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c29b5401ef58f083f85a8e200edfe4e6670b4a6d9a10c29e8c2df49cb38fe6d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:45:18 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c29b5401ef58f083f85a8e200edfe4e6670b4a6d9a10c29e8c2df49cb38fe6d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:45:18 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c29b5401ef58f083f85a8e200edfe4e6670b4a6d9a10c29e8c2df49cb38fe6d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:45:18 np0005604791 podman[100531]: 2026-02-02 09:45:18.548095428 +0000 UTC m=+0.023432174 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:45:18 np0005604791 podman[100531]: 2026-02-02 09:45:18.642048584 +0000 UTC m=+0.117385320 container init 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Feb  2 04:45:18 np0005604791 podman[100531]: 2026-02-02 09:45:18.653230461 +0000 UTC m=+0.128567197 container start 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:45:18 np0005604791 python3.9[100500]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:45:18 np0005604791 bash[100531]: 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e
Feb  2 04:45:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb  2 04:45:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb  2 04:45:18 np0005604791 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:45:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb  2 04:45:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb  2 04:45:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb  2 04:45:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb  2 04:45:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb  2 04:45:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:45:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb  2 04:45:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:19.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb  2 04:45:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:19.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:19 np0005604791 python3.9[100740]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:45:19 np0005604791 python3.9[100820]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:45:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:20 np0005604791 python3.9[100972]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:45:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:21.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:21.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:21 np0005604791 python3.9[101127]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:45:22 np0005604791 python3.9[101279]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:45:22 np0005604791 python3.9[101433]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:45:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:23.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:23.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:23 np0005604791 python3.9[101586]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:45:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094523 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:45:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:24 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Feb  2 04:45:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:24 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Feb  2 04:45:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:24 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:45:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:24 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:45:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:24 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Feb  2 04:45:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:25.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:25.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:26 np0005604791 python3.9[101744]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:45:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:26 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:45:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:26 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:45:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:26 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:45:26 np0005604791 python3.9[101899]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:45:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:27.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:27.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:27 np0005604791 python3.9[102053]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:45:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094528 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:45:28 np0005604791 python3.9[102205]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:45:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:29.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:29.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:29 np0005604791 python3.9[102361]: ansible-service_facts Invoked
Feb  2 04:45:29 np0005604791 network[102378]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb  2 04:45:29 np0005604791 network[102379]: 'network-scripts' will be removed from distribution in near future.
Feb  2 04:45:29 np0005604791 network[102380]: It is advised to switch to 'NetworkManager' instead for network management.
Feb  2 04:45:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:31.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:31.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000000a:nfs.cephfs.0: -2
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1f0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:33.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:33 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:33.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:34 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:34 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:35.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094535 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:45:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:35 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:35.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:36 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:36 np0005604791 python3.9[102884]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:45:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:36 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:37.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:37 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:37.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:38 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:38 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:39.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:39 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:39.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:39 np0005604791 python3.9[103041]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb  2 04:45:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:40 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:40 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:41.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:41 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:41 np0005604791 python3.9[103196]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:45:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:41.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:41 np0005604791 python3.9[103276]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:45:42 np0005604791 python3.9[103428]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:45:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:42 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:42 np0005604791 python3.9[103506]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:45:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:42 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:43.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:43 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:43.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:44 np0005604791 python3.9[103663]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:45:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:44 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:44 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0002df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb  2 04:45:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:45.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb  2 04:45:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:45 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:45.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:46 np0005604791 python3.9[103818]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:45:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:46 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:46 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:47.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:47 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0002df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:47.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:47 np0005604791 python3.9[103905]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:45:48 np0005604791 systemd[1]: session-41.scope: Deactivated successfully.
Feb  2 04:45:48 np0005604791 systemd[1]: session-41.scope: Consumed 22.798s CPU time.
Feb  2 04:45:48 np0005604791 systemd-logind[805]: Session 41 logged out. Waiting for processes to exit.
Feb  2 04:45:48 np0005604791 systemd-logind[805]: Removed session 41.
Feb  2 04:45:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:48 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:48 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:49.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:49 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:49.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:50 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0002df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:50 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:51.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:51 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb  2 04:45:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:51.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb  2 04:45:52 np0005604791 systemd[83549]: Created slice User Background Tasks Slice.
Feb  2 04:45:52 np0005604791 systemd[83549]: Starting Cleanup of User's Temporary Files and Directories...
Feb  2 04:45:52 np0005604791 systemd[83549]: Finished Cleanup of User's Temporary Files and Directories.
Feb  2 04:45:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:52 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:52 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:53.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:53 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:45:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:53.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:45:53 np0005604791 systemd-logind[805]: New session 42 of user zuul.
Feb  2 04:45:53 np0005604791 systemd[1]: Started Session 42 of User zuul.
Feb  2 04:45:54 np0005604791 python3.9[104126]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:45:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:54 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:54 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:55.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:55 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:55.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:55 np0005604791 python3.9[104279]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:45:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:45:55 np0005604791 python3.9[104359]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:45:56 np0005604791 systemd[1]: session-42.scope: Deactivated successfully.
Feb  2 04:45:56 np0005604791 systemd[1]: session-42.scope: Consumed 1.420s CPU time.
Feb  2 04:45:56 np0005604791 systemd-logind[805]: Session 42 logged out. Waiting for processes to exit.
Feb  2 04:45:56 np0005604791 systemd-logind[805]: Removed session 42.
Feb  2 04:45:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:56 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:56 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:57.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:57 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:57.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:58 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:58 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:59.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:45:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:59 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:45:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:45:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:45:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:59.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:00 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:00 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:01.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:01 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:01.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:01 np0005604791 systemd-logind[805]: New session 43 of user zuul.
Feb  2 04:46:01 np0005604791 systemd[1]: Started Session 43 of User zuul.
Feb  2 04:46:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:02 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:02 np0005604791 python3.9[104549]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:46:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:02 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:03.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:03 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:03.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:03 np0005604791 python3.9[104706]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:04 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:04 np0005604791 python3.9[104886]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:04 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:05.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:05 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1cc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:05 np0005604791 python3.9[104964]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.0juwud0s recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:05.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094605 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:46:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:06 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:06 np0005604791 python3.9[105118]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:06 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:07.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:07 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy ignored for local
Feb  2 04:46:07 np0005604791 kernel: ganesha.nfsd[102536]: segfault at 50 ip 00007fe2786d032e sp 00007fe1e4ff8210 error 4 in libntirpc.so.5.8[7fe2786b5000+2c000] likely on CPU 2 (core 0, socket 2)
Feb  2 04:46:07 np0005604791 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
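The kernel's `segfault at 50 ... error 4` line above encodes the standard x86 page-fault error-code bits (the kernel's `X86_PF_*` flags); a quick decode of that value:

```python
# x86 page-fault error-code bits (kernel X86_PF_* flags):
#   bit 0: fault on a present page (protection fault) vs. not-present
#   bit 1: write access vs. read
#   bit 2: fault taken in user mode vs. kernel mode
def decode_pf_error(code):
    return {
        "present": bool(code & 1),
        "write": bool(code & 2),
        "user": bool(code & 4),
    }

print(decode_pf_error(4))
# {'present': False, 'write': False, 'user': True}
# i.e. a user-mode read of an unmapped address -- together with
# "segfault at 50", this looks like a NULL-pointer dereference at a
# small struct offset (0x50) inside libntirpc.
```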
Feb  2 04:46:07 np0005604791 systemd[1]: Started Process Core Dump (PID 105198/UID 0).
Feb  2 04:46:07 np0005604791 python3.9[105197]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.bx0j27wq recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:07.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:08 np0005604791 systemd-coredump[105199]: Process 100550 (ganesha.nfsd) of user 0 dumped core.
    Stack trace of thread 55:
    #0  0x00007fe2786d032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
    #1  0x0000000000000000 n/a (n/a + 0x0)
    #2  0x00007fe2786da900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
    ELF object binary architecture: AMD x86-64
Feb  2 04:46:08 np0005604791 systemd[1]: systemd-coredump@3-105198-0.service: Deactivated successfully.
Feb  2 04:46:08 np0005604791 python3.9[105353]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:46:08 np0005604791 podman[105358]: 2026-02-02 09:46:08.140773729 +0000 UTC m=+0.029216273 container died 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:46:08 np0005604791 systemd[1]: var-lib-containers-storage-overlay-7c29b5401ef58f083f85a8e200edfe4e6670b4a6d9a10c29e8c2df49cb38fe6d-merged.mount: Deactivated successfully.
Feb  2 04:46:08 np0005604791 podman[105358]: 2026-02-02 09:46:08.194327833 +0000 UTC m=+0.082770367 container remove 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Feb  2 04:46:08 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb  2 04:46:08 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 04:46:08 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.070s CPU time.
Feb  2 04:46:08 np0005604791 python3.9[105554]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:09.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:09 np0005604791 python3.9[105634]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:46:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:09.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:09 np0005604791 python3.9[105786]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:10 np0005604791 python3.9[105864]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:46:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:11.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:11 np0005604791 python3.9[106017]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:11.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:12 np0005604791 python3.9[106171]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:13.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094613 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:46:13 np0005604791 python3.9[106252]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:13.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:14 np0005604791 python3.9[106512]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:15 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:46:15 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:46:15 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:46:15 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:46:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:15.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:15 np0005604791 python3.9[106591]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:46:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:15.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:46:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:16 np0005604791 python3.9[106745]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:46:16 np0005604791 systemd[1]: Reloading.
Feb  2 04:46:16 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:46:16 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:46:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:17.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:17.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:17 np0005604791 python3.9[106938]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:18 np0005604791 python3.9[107016]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:18 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 4.
Feb  2 04:46:18 np0005604791 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:46:18 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.070s CPU time.
Feb  2 04:46:18 np0005604791 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:46:18 np0005604791 podman[107220]: 2026-02-02 09:46:18.790941664 +0000 UTC m=+0.052598199 container create 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:46:18 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02c4e8818d725dcc6f5dc4c5bdc381f7c01eae4ee14542779dfb9b0e85e8592f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb  2 04:46:18 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02c4e8818d725dcc6f5dc4c5bdc381f7c01eae4ee14542779dfb9b0e85e8592f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:46:18 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02c4e8818d725dcc6f5dc4c5bdc381f7c01eae4ee14542779dfb9b0e85e8592f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:46:18 np0005604791 python3.9[107201]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:18 np0005604791 podman[107220]: 2026-02-02 09:46:18.768580936 +0000 UTC m=+0.030237451 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:46:18 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02c4e8818d725dcc6f5dc4c5bdc381f7c01eae4ee14542779dfb9b0e85e8592f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:46:18 np0005604791 podman[107220]: 2026-02-02 09:46:18.877639915 +0000 UTC m=+0.139296440 container init 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Feb  2 04:46:18 np0005604791 podman[107220]: 2026-02-02 09:46:18.881632252 +0000 UTC m=+0.143288777 container start 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:46:18 np0005604791 bash[107220]: 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1
Feb  2 04:46:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb  2 04:46:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb  2 04:46:18 np0005604791 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:46:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb  2 04:46:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb  2 04:46:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb  2 04:46:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb  2 04:46:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb  2 04:46:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:46:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:19.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:19 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:46:19 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:46:19 np0005604791 python3.9[107355]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:19.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:20 np0005604791 python3.9[107532]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:46:20 np0005604791 systemd[1]: Reloading.
Feb  2 04:46:20 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:46:20 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:46:20 np0005604791 systemd[1]: Starting Create netns directory...
Feb  2 04:46:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:20 np0005604791 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb  2 04:46:20 np0005604791 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb  2 04:46:20 np0005604791 systemd[1]: Finished Create netns directory.
Feb  2 04:46:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:21.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:21.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:21 np0005604791 python3.9[107726]: ansible-ansible.builtin.service_facts Invoked
Feb  2 04:46:21 np0005604791 network[107743]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb  2 04:46:21 np0005604791 network[107744]: 'network-scripts' will be removed from distribution in near future.
Feb  2 04:46:21 np0005604791 network[107745]: It is advised to switch to 'NetworkManager' instead for network management.
Feb  2 04:46:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:23.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:23.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:24 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:24 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:24 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:24 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:46:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:46:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:25.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:25.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:26 np0005604791 python3.9[108015]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:26 np0005604791 python3.9[108095]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:27 np0005604791 python3.9[108248]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:27.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:27.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:27 np0005604791 python3.9[108402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094627 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:46:28 np0005604791 python3.9[108480]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:29.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:29.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:29 np0005604791 python3.9[108635]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb  2 04:46:29 np0005604791 systemd[1]: Starting Time & Date Service...
Feb  2 04:46:29 np0005604791 systemd[1]: Started Time & Date Service.
Feb  2 04:46:30 np0005604791 python3.9[108793]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000000c:nfs.cephfs.0: -2
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:46:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:31.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:31 np0005604791 python3.9[108958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:31.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:31 np0005604791 python3.9[109042]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:32 np0005604791 python3.9[109194]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:32 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:46:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:32 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:32 np0005604791 python3.9[109273]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.owyvyunj recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:32 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4000e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:33.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094633 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:46:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:33 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4000e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:33.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:33 np0005604791 python3.9[109453]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:34 np0005604791 python3.9[109531]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:34 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:34 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:35.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:35 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:35 np0005604791 python3.9[109684]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:46:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:35.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:36 np0005604791 python3[109839]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb  2 04:46:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:36 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:37 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:37.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:37 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:37 np0005604791 python3.9[109994]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:37.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:37 np0005604791 python3.9[110072]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:38 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:38 np0005604791 python3.9[110227]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:39 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:39.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:39 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:39.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:39 np0005604791 python3.9[110352]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025598.0979018-895-242187411062551/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:40 np0005604791 python3.9[110506]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:40 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:40 np0005604791 python3.9[110584]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:41 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:41.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:41 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:41.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:41 np0005604791 python3.9[110739]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:42 np0005604791 python3.9[110817]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:42 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:43 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:43 np0005604791 python3.9[110972]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:43.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:43 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:43.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:43 np0005604791 python3.9[111050]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:44 np0005604791 python3.9[111202]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:46:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:44 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb80016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:44 np0005604791 python3.9[111360]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:45 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:45.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:45 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:45.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:45 np0005604791 python3.9[111514]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:46 np0005604791 python3.9[111666]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:46 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:47 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:47.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:47 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:46:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:47.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:46:47 np0005604791 python3.9[111821]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb  2 04:46:48 np0005604791 python3.9[111973]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb  2 04:46:48 np0005604791 systemd[1]: session-43.scope: Deactivated successfully.
Feb  2 04:46:48 np0005604791 systemd[1]: session-43.scope: Consumed 26.026s CPU time.
Feb  2 04:46:48 np0005604791 systemd-logind[805]: Session 43 logged out. Waiting for processes to exit.
Feb  2 04:46:48 np0005604791 systemd-logind[805]: Removed session 43.
Feb  2 04:46:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:48 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb80021e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:49 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:49.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:49 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:46:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:49.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:46:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:50 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:51 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:51.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:51 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:51.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:52 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:53 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8002b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:53.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:53 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:46:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:53.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:46:54 np0005604791 systemd-logind[805]: New session 44 of user zuul.
Feb  2 04:46:54 np0005604791 systemd[1]: Started Session 44 of User zuul.
Feb  2 04:46:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:54 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:54 np0005604791 python3.9[112193]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb  2 04:46:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:55 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:55.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:55 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8002b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:55.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:46:55 np0005604791 python3.9[112347]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:46:56 np0005604791 python3.9[112501]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Feb  2 04:46:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:56 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:57 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:57.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:57 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:57 np0005604791 python3.9[112656]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.j21xtwvl follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:46:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:46:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:57.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:46:57 np0005604791 python3.9[112781]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.j21xtwvl mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025616.8048086-103-189466703170045/.source.j21xtwvl _original_basename=.ztnzdaqt follow=False checksum=0e76d40d6d80e8dcbe1329e9f4d8b9bf39ee9960 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:46:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:58 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8002b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:59 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:59 np0005604791 python3.9[112936]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:46:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:59 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:46:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:59.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:46:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:46:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:59.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:46:59 np0005604791 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb  2 04:46:59 np0005604791 python3.9[113090]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpaaLVd9Gqbxcksz46sKNkp3Eu2TY3fUjtOhbkLQru93qJt/RNDTocNiUrE9VAj/UXp9dZqSHg1Hr7ScqXu7zqgZ9i+mq6N7P7QR+ZkN8jLQSybnPztI7X/QWaPhT0j1ArMrYk2F2Me+kAQiFL0GoR2d8udRElL8YKKIYQ6zjC/h2ZsU0WyVET9uiTgeMP/njtMzRSgO2Wp6no4KqJEOMSEY1lgURjVsMWkTr4hGz523SooA41GzquuNamnj1ELwKZSAH+TtVgI8oFJ2T+5TZiE/oW2MizbBwjKA3V5DlnGOEG49eG+LhZ/eWb6jQ7OnJARA/iLU/FsJ+CaGSbRK20/OWXP4JSZu7liaD0DIHM0DwrjEnQcXI6SbfAoAQ494KFtZvFamem7CPtrVhgNAKqybRbDcEQGpDxQgrWeA3m4HyGIBym+IvMUfYlNke9frCkwNpXRH93TK6E/ziPFrBHKkdRcFxVdsG2u1Y+adxOQk7KCjq/skzXBPCPDaHnzBM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIKtQmhiX/LRkxZONUn47u07V1HNePVW1EWKmTbmuGuY#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE0cPV3BwiB9Cc5Ne48bCCSZwMzF/hH7iFXwAiP/TK2pzWYsdZw1mOSJ+vDu1KclkDtQKmwN6Cu0N7j7domqlzE=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDXvxaVTYbHTHv+9EzKdF3T8+Yr2otW2YLuSqNTF+yJaKACfB7wDlIhKDGTHiU1FDrkO4tJ+R3OL/2ZXoIlxp5JSdCgcb42X+5PTj1wPkayVlQW7e0wQvT3kYhrcPtjLgk4T39/sionMGYUat45idwoB6hUSPLdk/L5+n0/3LEg1lByOM/B1/p8wGzHn6H9CWoIP3Ctd6lmrxtIVU1u+pxiBVQCcMjw5gtqsB54l670fL7El5XEkqjRjKHhylw9QTYN3AWMKuQKwcjClm/57/SoFMP7o52r653wGDH9cpvDgs0RYG4bA1mGY5OMkYbDJfcy0CViKEu5qWW4cTBLh/Z88D2EuNlINj3Q1YJk3RwF6vYl31MMsbBW10YhIiBJrA5XF0BLARqBOZ1e6v7JKTSwa7wGGtRzEzbY+me9zl6ZhhDru/I+h24J4MeBA07HvQIS2v8O95tPz76YZJ3DkWlywFWbALG8M4+fkpuQtvVpBZMgdvIWW0kfXO/grGnrgY8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIG3OEs+fDFWrKRKifY4uXYtOpS/6/8E88qPQNs1apj/z#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFy9hRh0QDNcy30491f4FwmL+9BopSuPxbkVyWhY9VytT/FG5rm9/DLYyukpd9IKttcZyerq0gzfokDrht76FB4=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDTA16t8OsOL4s99BOiNF3vckRPwnc9DwrgEMUjNAF5ofBbR7O7JlFD47GnI33lZr51vVc0wnvTxhpFA0jVvhKqVWdJ3lApNf34bJmaJBr8uiy/i3Q84MsUtXBLQ0FDCbwgaPnreNbMz3ae+u9H+Z73jQSP+gnQ5oYWhONHgO4HHkF8K7a8Bow3H5qwfbHz8o7mFQmTpYHwOcwhA53BTbh1NiEJZJNSg7wi1hH7vELUAzts1cbF2slTE0nh8XjMogq9ukokrCIKfE+xX7PmAawCuMnfvGX93zF1298pGcUKqvpnIfUOMDGtJtYEZ8sWsr5aH1YXIoJfHuux/YosRx3XDD5oEcpX0nYKVW6bumHsFIS199XAM5LtWWNr2eMcrbZhVwHNdELC6zoL7QjbBQ+2j/+8nJLq9vIghewgO3EFWK3r7kIVQZg8GYLZ/yisH4cvzUTACRXAF+1o2rq+AUfX3nTSsrqyZQUwlnWpc1vsceEO0Lsuac5tvGylnsJBfmM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN317jbKb2FNELHPgcKtyDLq5kCgCZN/b/8qYDuirt4l#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNpgfrlTfGut7rGFnGEpIiXrs2U1SQK0Fr1bAmmw8notvdnn6jtGfPfwX96hGwcOu4AlAS/i7X7XgbLw573Ooww=#012 create=True mode=0644 path=/tmp/ansible.j21xtwvl state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:00 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:00 np0005604791 python3.9[113245]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.j21xtwvl' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:47:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:01 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8002b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:01 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:01.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:01.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:01 np0005604791 python3.9[113402]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.j21xtwvl state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:02 np0005604791 systemd-logind[805]: Session 44 logged out. Waiting for processes to exit.
Feb  2 04:47:02 np0005604791 systemd[1]: session-44.scope: Deactivated successfully.
Feb  2 04:47:02 np0005604791 systemd[1]: session-44.scope: Consumed 4.503s CPU time.
Feb  2 04:47:02 np0005604791 systemd-logind[805]: Removed session 44.
Feb  2 04:47:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:02 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:03 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:03 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:03.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:03.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:04 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:05 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:05 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:05.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:05.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:06 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4001e20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:07 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:07 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:07.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:07 np0005604791 systemd-logind[805]: New session 45 of user zuul.
Feb  2 04:47:07 np0005604791 systemd[1]: Started Session 45 of User zuul.
Feb  2 04:47:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:07.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:08 np0005604791 python3.9[113594]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:47:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:08 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:09 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4001e20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:09 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:47:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:09.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:47:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:09.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:09 np0005604791 python3.9[113753]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb  2 04:47:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:10 np0005604791 python3.9[113909]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:47:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:10 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:11 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:11 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4001e20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:11.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:11.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:12 np0005604791 python3.9[114065]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:47:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:12 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:13 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:13 np0005604791 python3.9[114221]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:47:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:13 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:13.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:13.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:14 np0005604791 python3.9[114399]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:14 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:15 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:15 np0005604791 systemd[1]: session-45.scope: Deactivated successfully.
Feb  2 04:47:15 np0005604791 systemd[1]: session-45.scope: Consumed 3.793s CPU time.
Feb  2 04:47:15 np0005604791 systemd-logind[805]: Session 45 logged out. Waiting for processes to exit.
Feb  2 04:47:15 np0005604791 systemd-logind[805]: Removed session 45.
Feb  2 04:47:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:15 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:15.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:15.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.413219) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636413354, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1914, "num_deletes": 250, "total_data_size": 5251899, "memory_usage": 5339912, "flush_reason": "Manual Compaction"}
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636429352, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1997941, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10754, "largest_seqno": 12663, "table_properties": {"data_size": 1992244, "index_size": 2837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14181, "raw_average_key_size": 20, "raw_value_size": 1979844, "raw_average_value_size": 2800, "num_data_blocks": 126, "num_entries": 707, "num_filter_entries": 707, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025454, "oldest_key_time": 1770025454, "file_creation_time": 1770025636, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 16172 microseconds, and 4081 cpu microseconds.
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.429414) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1997941 bytes OK
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.429432) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.430707) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.430719) EVENT_LOG_v1 {"time_micros": 1770025636430715, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.430737) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 5243327, prev total WAL file size 5243327, number of live WAL files 2.
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.431627) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1951KB)], [21(13MB)]
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636431713, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 16268083, "oldest_snapshot_seqno": -1}
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4357 keys, 14293866 bytes, temperature: kUnknown
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636547842, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14293866, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14260716, "index_size": 21136, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10949, "raw_key_size": 110176, "raw_average_key_size": 25, "raw_value_size": 14177084, "raw_average_value_size": 3253, "num_data_blocks": 906, "num_entries": 4357, "num_filter_entries": 4357, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770025636, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.548324) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14293866 bytes
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.551279) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 139.9 rd, 122.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 13.6 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(15.3) write-amplify(7.2) OK, records in: 4784, records dropped: 427 output_compression: NoCompression
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.551319) EVENT_LOG_v1 {"time_micros": 1770025636551301, "job": 10, "event": "compaction_finished", "compaction_time_micros": 116264, "compaction_time_cpu_micros": 33950, "output_level": 6, "num_output_files": 1, "total_output_size": 14293866, "num_input_records": 4784, "num_output_records": 4357, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636551982, "job": 10, "event": "table_file_deletion", "file_number": 23}
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636554920, "job": 10, "event": "table_file_deletion", "file_number": 21}
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.431493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.554997) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.555007) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.555010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.555013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:16 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.555017) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:16 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:17 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:17 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:17.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:17.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:19 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:19 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:19.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:19.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:20 np0005604791 systemd-logind[805]: New session 46 of user zuul.
Feb  2 04:47:20 np0005604791 systemd[1]: Started Session 46 of User zuul.
Feb  2 04:47:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:47:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:47:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:47:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:47:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:20 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:21 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:21 np0005604791 python3.9[114673]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:47:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:21 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:21.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:21.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:22 np0005604791 python3.9[114831]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:47:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:22 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:23 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:23 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:23.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:23 np0005604791 python3.9[114918]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb  2 04:47:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:23.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:24 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:25.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:25.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:25 np0005604791 python3.9[115072]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:47:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:47:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:47:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:26 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:26 np0005604791 python3.9[115251]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb  2 04:47:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:27 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:27 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:27.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:27.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:27 np0005604791 python3.9[115402]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:47:28 np0005604791 python3.9[115553]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:47:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:28 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:29 np0005604791 systemd-logind[805]: Session 46 logged out. Waiting for processes to exit.
Feb  2 04:47:29 np0005604791 systemd[1]: session-46.scope: Deactivated successfully.
Feb  2 04:47:29 np0005604791 systemd[1]: session-46.scope: Consumed 5.416s CPU time.
Feb  2 04:47:29 np0005604791 systemd-logind[805]: Removed session 46.
Feb  2 04:47:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:29 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:29 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:29.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:29.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:30 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:31.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:31.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:32 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:33 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:33 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:33.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:33.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:34 np0005604791 systemd-logind[805]: New session 47 of user zuul.
Feb  2 04:47:34 np0005604791 systemd[1]: Started Session 47 of User zuul.
Feb  2 04:47:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:34 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:35 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:35 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:35.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:35.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:35 np0005604791 python3.9[115774]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:47:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:36 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:37 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:37 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:37.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:37 np0005604791 python3.9[115933]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:47:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:37.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:37 np0005604791 python3.9[116087]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:47:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:38 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:38 np0005604791 python3.9[116239]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:39 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:39 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:39.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:39.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:39 np0005604791 python3.9[116365]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025658.1468506-149-25167533310985/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=bbbde0bea78cfeba25f07606728ce69c42c7d6f3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:40 np0005604791 python3.9[116517]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:40 np0005604791 python3.9[116642]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025659.7466278-149-59931358464830/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=98d348615e61a9b68b5c5fd470bc9aeb831c56b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:40 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:41 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:41 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:41.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:41 np0005604791 python3.9[116795]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:41.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:41 np0005604791 python3.9[116918]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025660.8416958-149-279182533082446/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=42ff746d592f57a5dc4052c4590df75e42f43be8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:42 np0005604791 python3.9[117072]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:47:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:42 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:43 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:43 np0005604791 python3.9[117225]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:47:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:43 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:47:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:43.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:47:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:43.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:43 np0005604791 python3.9[117379]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:44 np0005604791 python3.9[117502]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025663.3480186-326-207816344631668/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=453d3b7a2aef7dfcee9fd995557ae6920a7b055b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:44 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:44 np0005604791 python3.9[117657]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:45 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:45 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac001ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:45.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:45 np0005604791 python3.9[117780]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025664.537104-326-46680391540927/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=d08fd1db2672bef6291fde5319a05fae0b3732d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:45.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:46 np0005604791 python3.9[117932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:46 np0005604791 python3.9[118057]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025665.5931032-326-25727540431725/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=41662b67015605c8326ed9e90f71bc0a5c935d1a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:46 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb40039a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:47 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb40039a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:47 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:47.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:47 np0005604791 python3.9[118210]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:47:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:47.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:48 np0005604791 python3.9[118364]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:47:48 np0005604791 python3.9[118516]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:48 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac001ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:49 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:49 np0005604791 python3.9[118640]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025668.250418-495-150443268582787/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=cc0f86f89c3e4ae1f8702736ba65f2dc1e3f1c08 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:49 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb40039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:49.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:49.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:49 np0005604791 python3.9[118794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:50 np0005604791 python3.9[118917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025669.3261507-495-62181352058023/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=d08fd1db2672bef6291fde5319a05fae0b3732d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:50 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:51 np0005604791 python3.9[119072]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:51 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac001ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:51 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:51.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:51.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:51 np0005604791 python3.9[119195]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025670.5543776-495-184201265955902/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=a845be08b1950ef1f3ad8a3b70e4630a68f71b53 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:52 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb40039e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:52 np0005604791 python3.9[119349]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:47:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:53 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:53 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac003030 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:47:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:53.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:47:53 np0005604791 python3.9[119504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:53.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:54 np0005604791 python3.9[119628]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025673.045883-699-60187974682127/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:54 np0005604791 python3.9[119804]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:47:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:54 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac003030 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:55 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:55 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:55.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:55 np0005604791 python3.9[119959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:55.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:47:56 np0005604791 python3.9[120082]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025674.9764016-770-260199763306598/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.468882) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676468945, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 656, "num_deletes": 251, "total_data_size": 1200201, "memory_usage": 1215632, "flush_reason": "Manual Compaction"}
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676481337, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 787349, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12668, "largest_seqno": 13319, "table_properties": {"data_size": 784120, "index_size": 1137, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7412, "raw_average_key_size": 18, "raw_value_size": 777574, "raw_average_value_size": 1963, "num_data_blocks": 50, "num_entries": 396, "num_filter_entries": 396, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025637, "oldest_key_time": 1770025637, "file_creation_time": 1770025676, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 12808 microseconds, and 2187 cpu microseconds.
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.481689) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 787349 bytes OK
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.481815) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.483332) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.483355) EVENT_LOG_v1 {"time_micros": 1770025676483348, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.483376) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 1196582, prev total WAL file size 1196582, number of live WAL files 2.
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.484636) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(768KB)], [24(13MB)]
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676484671, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15081215, "oldest_snapshot_seqno": -1}
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4238 keys, 12202197 bytes, temperature: kUnknown
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676622078, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 12202197, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12171406, "index_size": 19097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 108602, "raw_average_key_size": 25, "raw_value_size": 12091318, "raw_average_value_size": 2853, "num_data_blocks": 807, "num_entries": 4238, "num_filter_entries": 4238, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770025676, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.622413) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 12202197 bytes
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.623696) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.6 rd, 88.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 13.6 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(34.7) write-amplify(15.5) OK, records in: 4753, records dropped: 515 output_compression: NoCompression
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.623728) EVENT_LOG_v1 {"time_micros": 1770025676623713, "job": 12, "event": "compaction_finished", "compaction_time_micros": 137541, "compaction_time_cpu_micros": 19722, "output_level": 6, "num_output_files": 1, "total_output_size": 12202197, "num_input_records": 4753, "num_output_records": 4238, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676623964, "job": 12, "event": "table_file_deletion", "file_number": 26}
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676626109, "job": 12, "event": "table_file_deletion", "file_number": 24}
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.484549) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.626228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.626236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.626239) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.626241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:56 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.626244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:47:56 np0005604791 python3.9[120236]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:47:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:56 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:57 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:57 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:47:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:57.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:47:57 np0005604791 python3.9[120389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:57.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:57 np0005604791 python3.9[120514]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025676.973876-841-158348192090659/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:47:58 np0005604791 python3.9[120666]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:47:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:58 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:59 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:59 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac003030 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:47:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:59.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:59 np0005604791 python3.9[120821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:47:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:47:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:47:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:59.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:47:59 np0005604791 python3.9[120944]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025678.8767219-913-97381034617978/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:00 np0005604791 python3.9[121096]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:48:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:48:00 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:48:01 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:01 np0005604791 python3.9[121251]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:48:01 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy ignored for local
Feb  2 04:48:01 np0005604791 kernel: ganesha.nfsd[115675]: segfault at 50 ip 00007f0c5e63c32e sp 00007f0bc27fb210 error 4 in libntirpc.so.5.8[7f0c5e621000+2c000] likely on CPU 2 (core 0, socket 2)
Feb  2 04:48:01 np0005604791 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb  2 04:48:01 np0005604791 systemd[1]: Started Process Core Dump (PID 121252/UID 0).
Feb  2 04:48:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:01.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:01.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:01 np0005604791 python3.9[121376]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025680.8098073-985-178843502379072/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:02 np0005604791 systemd-coredump[121253]: Process 107242 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 59:#012#0  0x00007f0c5e63c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Feb  2 04:48:02 np0005604791 systemd[1]: systemd-coredump@4-121252-0.service: Deactivated successfully.
Feb  2 04:48:02 np0005604791 podman[121462]: 2026-02-02 09:48:02.201834604 +0000 UTC m=+0.026885060 container died 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb  2 04:48:02 np0005604791 systemd[1]: var-lib-containers-storage-overlay-02c4e8818d725dcc6f5dc4c5bdc381f7c01eae4ee14542779dfb9b0e85e8592f-merged.mount: Deactivated successfully.
Feb  2 04:48:02 np0005604791 podman[121462]: 2026-02-02 09:48:02.239584884 +0000 UTC m=+0.064635320 container remove 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:48:02 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb  2 04:48:02 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 04:48:02 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.142s CPU time.
Feb  2 04:48:02 np0005604791 python3.9[121562]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:48:03 np0005604791 python3.9[121728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:03.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:03.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:03 np0005604791 python3.9[121853]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025682.6906075-1057-20545773214041/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:05 np0005604791 systemd-logind[805]: Session 47 logged out. Waiting for processes to exit.
Feb  2 04:48:05 np0005604791 systemd[1]: session-47.scope: Deactivated successfully.
Feb  2 04:48:05 np0005604791 systemd[1]: session-47.scope: Consumed 21.079s CPU time.
Feb  2 04:48:05 np0005604791 systemd-logind[805]: Removed session 47.
Feb  2 04:48:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:05.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:05.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094807 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:48:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:07.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:07.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:09.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:09.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:10 np0005604791 systemd-logind[805]: New session 48 of user zuul.
Feb  2 04:48:10 np0005604791 systemd[1]: Started Session 48 of User zuul.
Feb  2 04:48:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:48:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:11.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:48:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:11.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:11 np0005604791 python3.9[122045]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:12 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 5.
Feb  2 04:48:12 np0005604791 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:48:12 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.142s CPU time.
Feb  2 04:48:12 np0005604791 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:48:12 np0005604791 python3.9[122199]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:12 np0005604791 podman[122294]: 2026-02-02 09:48:12.791923113 +0000 UTC m=+0.055012941 container create 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:48:12 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8524dd3387d19fcf089347383faa64c8e2290cf613dceca79426c8e374e209c0/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb  2 04:48:12 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8524dd3387d19fcf089347383faa64c8e2290cf613dceca79426c8e374e209c0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:48:12 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8524dd3387d19fcf089347383faa64c8e2290cf613dceca79426c8e374e209c0/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:48:12 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8524dd3387d19fcf089347383faa64c8e2290cf613dceca79426c8e374e209c0/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:48:12 np0005604791 podman[122294]: 2026-02-02 09:48:12.858533554 +0000 UTC m=+0.121623382 container init 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 04:48:12 np0005604791 podman[122294]: 2026-02-02 09:48:12.767210733 +0000 UTC m=+0.030300601 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:48:12 np0005604791 podman[122294]: 2026-02-02 09:48:12.873273168 +0000 UTC m=+0.136362996 container start 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Feb  2 04:48:12 np0005604791 bash[122294]: 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5
Feb  2 04:48:12 np0005604791 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:48:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb  2 04:48:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb  2 04:48:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb  2 04:48:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb  2 04:48:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb  2 04:48:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb  2 04:48:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb  2 04:48:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:48:13 np0005604791 python3.9[122429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025691.9857852-58-248018369872718/.source.conf _original_basename=ceph.conf follow=False checksum=d5af35537b3c8ec6eada2ba8657e5bbbf335fb7a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:13.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:13.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:13 np0005604791 python3.9[122581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:14 np0005604791 python3.9[122729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025693.507988-58-91701217454511/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=b59eb4ee1ef760db0b0353d13f50139cad503c44 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:15 np0005604791 systemd[1]: session-48.scope: Deactivated successfully.
Feb  2 04:48:15 np0005604791 systemd[1]: session-48.scope: Consumed 2.642s CPU time.
Feb  2 04:48:15 np0005604791 systemd-logind[805]: Session 48 logged out. Waiting for processes to exit.
Feb  2 04:48:15 np0005604791 systemd-logind[805]: Removed session 48.
Feb  2 04:48:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:15.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:15.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:17.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:48:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:17.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:48:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:18 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:48:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:18 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:48:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:19.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:19.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:20 np0005604791 systemd-logind[805]: New session 49 of user zuul.
Feb  2 04:48:20 np0005604791 systemd[1]: Started Session 49 of User zuul.
Feb  2 04:48:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:21 np0005604791 python3.9[122921]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:48:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:21.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:21.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:22 np0005604791 python3.9[123079]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:48:23 np0005604791 python3.9[123232]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:48:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:23.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:23.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:23 np0005604791 python3.9[123384]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:48:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094824 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:48:24 np0005604791 python3.9[123536]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:25.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:25.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:25 np0005604791 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb  2 04:48:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:26 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:27 np0005604791 python3.9[123795]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:48:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:27 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:48:27 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:48:27 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:48:27 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:48:27 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:48:27 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:48:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094827 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:48:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:48:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:27.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:48:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:27.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:27 np0005604791 python3.9[123879]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:48:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:28 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:29.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:29.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:30 np0005604791 python3.9[124037]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb  2 04:48:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:31.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:31 np0005604791 python3[124195]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb  2 04:48:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:31.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:32 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:48:32 np0005604791 python3.9[124347]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:48:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:48:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:32 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:33 np0005604791 python3.9[124527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:33.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:33.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:33 np0005604791 python3.9[124605]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:34 np0005604791 python3.9[124782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:34 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:34 np0005604791 python3.9[124860]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.gxzff1_3 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:48:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:48:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:35.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:35 np0005604791 python3.9[125015]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:35.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:36 np0005604791 python3.9[125093]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:36 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:37 np0005604791 python3.9[125248]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:48:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:37.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:37.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:38 np0005604791 python3[125403]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb  2 04:48:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:38 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:48:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:38 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:38 np0005604791 python3.9[125555]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:39.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:39.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:39 np0005604791 python3.9[125683]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025718.3788354-427-93548668268750/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:40 np0005604791 python3.9[125835]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:40 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:40 np0005604791 python3.9[125963]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025719.8363638-472-232289786823075/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:41.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:41.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:41 np0005604791 python3.9[126115]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:42 np0005604791 python3.9[126242]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025721.2520907-517-269301735388539/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:42 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:43 np0005604791 python3.9[126395]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:43.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:43.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:43 np0005604791 python3.9[126520]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025722.599423-562-5699715306639/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094844 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:48:44 np0005604791 python3.9[126674]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:44 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:45 np0005604791 python3.9[126800]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025723.9314237-607-263071308645558/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:48:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:45.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:48:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:45.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:45 np0005604791 python3.9[126954]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:46 np0005604791 python3.9[127106]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:48:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:46 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:47.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:47 np0005604791 python3.9[127264]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:47.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:48 np0005604791 python3.9[127418]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:48:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:48 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:49 np0005604791 python3.9[127572]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:48:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:49.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:49.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:49 np0005604791 python3.9[127728]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:48:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:50 np0005604791 python3.9[127883]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:48:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:50 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:51.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:51.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:52 np0005604791 python3.9[128036]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:48:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094852 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:48:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:52 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:53.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:53 np0005604791 python3.9[128192]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:9e:41:65:cf" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:48:53 np0005604791 ovs-vsctl[128193]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:9e:41:65:cf external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb  2 04:48:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:53.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:54 np0005604791 python3.9[128347]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:48:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:54 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:55 np0005604791 python3.9[128530]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:48:55 np0005604791 ovs-vsctl[128531]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb  2 04:48:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:48:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:55.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:48:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:48:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:55.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:55 np0005604791 python3.9[128681]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:48:56 np0005604791 python3.9[128837]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:48:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:56 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:57.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:57 np0005604791 python3.9[128992]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:57.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:57 np0005604791 python3.9[129072]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:48:58 np0005604791 python3.9[129224]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:48:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:58 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:59 np0005604791 python3.9[129303]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:48:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:48:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:59.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:48:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:48:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:59.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:48:59 np0005604791 python3.9[129457]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:00 np0005604791 python3.9[129609]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:00 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:01 np0005604791 python3.9[129690]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:49:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:01.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:01.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:01 np0005604791 python3.9[129842]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:02 np0005604791 python3.9[129922]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:02 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:03 np0005604791 python3.9[130075]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:49:03 np0005604791 systemd[1]: Reloading.
Feb  2 04:49:03 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:49:03 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:49:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:03.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:03.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:04 np0005604791 python3.9[130267]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:04 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:49:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:04 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:49:04 np0005604791 python3.9[130345]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:04 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:05 np0005604791 python3.9[130500]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:05.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:05.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:05 np0005604791 python3.9[130578]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:06 np0005604791 python3.9[130730]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:49:06 np0005604791 systemd[1]: Reloading.
Feb  2 04:49:06 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:49:06 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:49:06 np0005604791 systemd[1]: Starting Create netns directory...
Feb  2 04:49:06 np0005604791 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb  2 04:49:06 np0005604791 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb  2 04:49:06 np0005604791 systemd[1]: Finished Create netns directory.
Feb  2 04:49:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:06 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:49:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:07.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:07.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:07 np0005604791 python3.9[130926]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:08 np0005604791 python3.9[131080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 04:49:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 8069 writes, 33K keys, 8069 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s#012Cumulative WAL: 8069 writes, 1528 syncs, 5.28 writes per sync, written: 0.02 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8069 writes, 33K keys, 8069 commit groups, 1.0 writes per commit group, ingest: 21.03 MB, 0.04 MB/s#012Interval WAL: 8069 writes, 1528 syncs, 5.28 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 
seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Feb  2 04:49:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:08 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd80011e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:09 np0005604791 python3.9[131204]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025748.0329247-1360-169299244238092/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:09.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:09.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:10 np0005604791 python3.9[131358]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:10 np0005604791 python3.9[131512]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:10 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd80011e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:11.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:11 np0005604791 python3.9[131665]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:11.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:12 np0005604791 python3.9[131788]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025751.0971673-1459-92705423630933/.source.json _original_basename=.zs2mzgar follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094912 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:49:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:13.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:13 np0005604791 python3.9[131941]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:13.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:14 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8001380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:15.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:15.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:16 np0005604791 python3.9[132392]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb  2 04:49:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:16 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:17 np0005604791 python3.9[132547]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb  2 04:49:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:17.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:17.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:18 np0005604791 python3[132701]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb  2 04:49:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:18 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd80095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:19.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:19.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:20 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd80095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:21.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:21.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094922 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:49:22 np0005604791 podman[132716]: 2026-02-02 09:49:22.794647422 +0000 UTC m=+4.270110576 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e
Feb  2 04:49:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:22 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:22 np0005604791 podman[132841]: 2026-02-02 09:49:22.954558689 +0000 UTC m=+0.075084690 container create 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:49:22 np0005604791 podman[132841]: 2026-02-02 09:49:22.909016647 +0000 UTC m=+0.029542648 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e
Feb  2 04:49:22 np0005604791 python3[132701]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} 
--log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e
Feb  2 04:49:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:23.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:23.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:24 np0005604791 python3.9[133033]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:49:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:24 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:25 np0005604791 python3.9[133190]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:25.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:25 np0005604791 python3.9[133266]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:49:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:25.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:26 np0005604791 python3.9[133419]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770025765.5896683-1693-110278002173700/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:26 np0005604791 python3.9[133495]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb  2 04:49:26 np0005604791 systemd[1]: Reloading.
Feb  2 04:49:26 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:49:26 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:49:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:26 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:27.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:49:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:27.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:49:27 np0005604791 python3.9[133610]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:49:27 np0005604791 systemd[1]: Reloading.
Feb  2 04:49:27 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:49:27 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:49:27 np0005604791 systemd[1]: Starting ovn_controller container...
Feb  2 04:49:28 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:49:28 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/983c70ab7990a63defec4761ab2164346dd6a5e764615cf7607090ef8aebdfce/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb  2 04:49:28 np0005604791 systemd[1]: Started /usr/bin/podman healthcheck run 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2.
Feb  2 04:49:28 np0005604791 podman[133650]: 2026-02-02 09:49:28.081297474 +0000 UTC m=+0.134897962 container init 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: + sudo -E kolla_set_configs
Feb  2 04:49:28 np0005604791 podman[133650]: 2026-02-02 09:49:28.113701566 +0000 UTC m=+0.167302024 container start 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb  2 04:49:28 np0005604791 edpm-start-podman-container[133650]: ovn_controller
Feb  2 04:49:28 np0005604791 systemd[1]: Created slice User Slice of UID 0.
Feb  2 04:49:28 np0005604791 systemd[1]: Starting User Runtime Directory /run/user/0...
Feb  2 04:49:28 np0005604791 systemd[1]: Finished User Runtime Directory /run/user/0.
Feb  2 04:49:28 np0005604791 systemd[1]: Starting User Manager for UID 0...
Feb  2 04:49:28 np0005604791 edpm-start-podman-container[133649]: Creating additional drop-in dependency for "ovn_controller" (1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2)
Feb  2 04:49:28 np0005604791 podman[133673]: 2026-02-02 09:49:28.209005603 +0000 UTC m=+0.087636114 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb  2 04:49:28 np0005604791 systemd[1]: 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2-19dd92c79d235583.service: Main process exited, code=exited, status=1/FAILURE
Feb  2 04:49:28 np0005604791 systemd[1]: 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2-19dd92c79d235583.service: Failed with result 'exit-code'.
Feb  2 04:49:28 np0005604791 systemd[1]: Reloading.
Feb  2 04:49:28 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:49:28 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:49:28 np0005604791 systemd[133706]: Queued start job for default target Main User Target.
Feb  2 04:49:28 np0005604791 systemd[133706]: Created slice User Application Slice.
Feb  2 04:49:28 np0005604791 systemd[133706]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb  2 04:49:28 np0005604791 systemd[133706]: Started Daily Cleanup of User's Temporary Directories.
Feb  2 04:49:28 np0005604791 systemd[133706]: Reached target Paths.
Feb  2 04:49:28 np0005604791 systemd[133706]: Reached target Timers.
Feb  2 04:49:28 np0005604791 systemd[133706]: Starting D-Bus User Message Bus Socket...
Feb  2 04:49:28 np0005604791 systemd[133706]: Starting Create User's Volatile Files and Directories...
Feb  2 04:49:28 np0005604791 systemd[133706]: Finished Create User's Volatile Files and Directories.
Feb  2 04:49:28 np0005604791 systemd[133706]: Listening on D-Bus User Message Bus Socket.
Feb  2 04:49:28 np0005604791 systemd[133706]: Reached target Sockets.
Feb  2 04:49:28 np0005604791 systemd[133706]: Reached target Basic System.
Feb  2 04:49:28 np0005604791 systemd[133706]: Reached target Main User Target.
Feb  2 04:49:28 np0005604791 systemd[133706]: Startup finished in 155ms.
Feb  2 04:49:28 np0005604791 systemd[1]: Started User Manager for UID 0.
Feb  2 04:49:28 np0005604791 systemd[1]: Started ovn_controller container.
Feb  2 04:49:28 np0005604791 systemd[1]: Started Session c1 of User root.
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: INFO:__main__:Validating config file
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: INFO:__main__:Writing out command to execute
Feb  2 04:49:28 np0005604791 systemd[1]: session-c1.scope: Deactivated successfully.
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: ++ cat /run_command
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: + ARGS=
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: + sudo kolla_copy_cacerts
Feb  2 04:49:28 np0005604791 systemd[1]: Started Session c2 of User root.
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: + [[ ! -n '' ]]
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: + . kolla_extend_start
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: + umask 0022
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Feb  2 04:49:28 np0005604791 systemd[1]: session-c2.scope: Deactivated successfully.
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <info>  [1770025768.6234] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <info>  [1770025768.6239] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <warn>  [1770025768.6241] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <info>  [1770025768.6246] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <info>  [1770025768.6250] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <info>  [1770025768.6253] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb  2 04:49:28 np0005604791 kernel: br-int: entered promiscuous mode
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00014|main|INFO|OVS feature set changed, force recompute.
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00022|main|INFO|OVS feature set changed, force recompute.
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb  2 04:49:28 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:28Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <info>  [1770025768.6446] manager: (ovn-efcb63-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Feb  2 04:49:28 np0005604791 systemd-udevd[133800]: Network interface NamePolicy= disabled on kernel command line.
Feb  2 04:49:28 np0005604791 kernel: genev_sys_6081: entered promiscuous mode
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <info>  [1770025768.6627] device (genev_sys_6081): carrier: link connected
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <info>  [1770025768.6630] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <info>  [1770025768.8043] manager: (ovn-1b0741-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Feb  2 04:49:28 np0005604791 NetworkManager[49055]: <info>  [1770025768.8704] manager: (ovn-031ca0-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Feb  2 04:49:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:28 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:29.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:29.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:29 np0005604791 python3.9[133931]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb  2 04:49:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:49:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:30 np0005604791 python3.9[134085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:31 np0005604791 python3.9[134209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025770.2638068-1828-243749836080261/.source.yaml _original_basename=.30gupeo4 follow=False checksum=49e9dd6dd1573230eefb068866cfd1da40e184ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:49:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:31.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:31.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:32 np0005604791 python3.9[134363]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:49:32 np0005604791 ovs-vsctl[134364]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb  2 04:49:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:32 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:33 np0005604791 python3.9[134597]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:49:33 np0005604791 ovs-vsctl[134643]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb  2 04:49:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:49:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:49:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:33.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:33.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:49:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:49:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:49:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:49:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:49:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:49:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:49:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:49:34 np0005604791 python3.9[134828]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:49:34 np0005604791 ovs-vsctl[134831]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb  2 04:49:34 np0005604791 systemd[1]: session-49.scope: Deactivated successfully.
Feb  2 04:49:34 np0005604791 systemd[1]: session-49.scope: Consumed 52.942s CPU time.
Feb  2 04:49:34 np0005604791 systemd-logind[805]: Session 49 logged out. Waiting for processes to exit.
Feb  2 04:49:34 np0005604791 systemd-logind[805]: Removed session 49.
Feb  2 04:49:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:34 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 04:49:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.0 total, 600.0 interval
Cumulative writes: 2411 writes, 14K keys, 2411 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s
Cumulative WAL: 2411 writes, 2411 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2411 writes, 14K keys, 2411 commit groups, 1.0 writes per commit group, ingest: 38.02 MB, 0.06 MB/s
Interval WAL: 2411 writes, 2411 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    106.8      0.20              0.04         6    0.033       0      0       0.0       0.0
  L6      1/0   11.64 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   2.9    102.0     88.5      0.70              0.11         5    0.139     21K   2265       0.0       0.0
 Sum      1/0   11.64 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.9     79.5     92.5      0.89              0.16        11    0.081     21K   2265       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.9     79.7     92.7      0.89              0.16        10    0.089     21K   2265       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0    102.0     88.5      0.70              0.11         5    0.139     21K   2265       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    107.9      0.19              0.04         5    0.039       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.021, interval 0.021
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.9 seconds
Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.9 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55a64debd350#2 capacity: 304.00 MB usage: 2.56 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 6.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(157,2.36 MB,0.777666%) FilterBlock(11,70.05 KB,0.0225017%) IndexBlock(11,132.27 KB,0.0424887%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Feb  2 04:49:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:35.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:35.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:36 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:49:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:36 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:37.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:37.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:38 np0005604791 systemd[1]: Stopping User Manager for UID 0...
Feb  2 04:49:38 np0005604791 systemd[133706]: Activating special unit Exit the Session...
Feb  2 04:49:38 np0005604791 systemd[133706]: Stopped target Main User Target.
Feb  2 04:49:38 np0005604791 systemd[133706]: Stopped target Basic System.
Feb  2 04:49:38 np0005604791 systemd[133706]: Stopped target Paths.
Feb  2 04:49:38 np0005604791 systemd[133706]: Stopped target Sockets.
Feb  2 04:49:38 np0005604791 systemd[133706]: Stopped target Timers.
Feb  2 04:49:38 np0005604791 systemd[133706]: Stopped Daily Cleanup of User's Temporary Directories.
Feb  2 04:49:38 np0005604791 systemd[133706]: Closed D-Bus User Message Bus Socket.
Feb  2 04:49:38 np0005604791 systemd[133706]: Stopped Create User's Volatile Files and Directories.
Feb  2 04:49:38 np0005604791 systemd[133706]: Removed slice User Application Slice.
Feb  2 04:49:38 np0005604791 systemd[133706]: Reached target Shutdown.
Feb  2 04:49:38 np0005604791 systemd[133706]: Finished Exit the Session.
Feb  2 04:49:38 np0005604791 systemd[133706]: Reached target Exit the Session.
Feb  2 04:49:38 np0005604791 systemd[1]: user@0.service: Deactivated successfully.
Feb  2 04:49:38 np0005604791 systemd[1]: Stopped User Manager for UID 0.
Feb  2 04:49:38 np0005604791 systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb  2 04:49:38 np0005604791 systemd[1]: run-user-0.mount: Deactivated successfully.
Feb  2 04:49:38 np0005604791 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb  2 04:49:38 np0005604791 systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb  2 04:49:38 np0005604791 systemd[1]: Removed slice User Slice of UID 0.
Feb  2 04:49:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:38 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:38 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:49:38 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:49:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb  2 04:49:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:39.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb  2 04:49:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:39.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:40 np0005604791 systemd-logind[805]: New session 51 of user zuul.
Feb  2 04:49:40 np0005604791 systemd[1]: Started Session 51 of User zuul.
Feb  2 04:49:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:40 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:41.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:41.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:41 np0005604791 python3.9[135077]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:49:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094942 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:49:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:42 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:49:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:43.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:49:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:43.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:44 np0005604791 python3.9[135239]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:44 np0005604791 python3.9[135391]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:44 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:45 np0005604791 python3.9[135546]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:45.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:45.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:45 np0005604791 python3.9[135698]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:46 np0005604791 python3.9[135850]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:46 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:47.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:47 np0005604791 python3.9[136002]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:49:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:47.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:48 np0005604791 python3.9[136156]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb  2 04:49:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:48 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:49.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:49.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:49 np0005604791 python3.9[136309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:50 np0005604791 python3.9[136432]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025789.1167662-214-80255114617812/.source follow=False _original_basename=haproxy.j2 checksum=35fdf371a5549b7e7e32a6541c07c1ac75cf4dcf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:50 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:51 np0005604791 python3.9[136583]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:51.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:51 np0005604791 python3.9[136704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025790.6593184-259-190124914151455/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:51.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:52 np0005604791 python3.9[136858]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:49:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:52 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:53.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:53 np0005604791 python3.9[136943]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:49:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:53.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:54 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:55.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:49:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:55.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:55 np0005604791 python3.9[137126]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb  2 04:49:56 np0005604791 python3.9[137281]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:56 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:57 np0005604791 python3.9[137403]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025796.3345869-370-110931356564794/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:57.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:57.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:57 np0005604791 python3.9[137553]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:49:58 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:58Z|00025|memory|INFO|16384 kB peak resident set size after 29.8 seconds
Feb  2 04:49:58 np0005604791 ovn_controller[133666]: 2026-02-02T09:49:58Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Feb  2 04:49:58 np0005604791 podman[137650]: 2026-02-02 09:49:58.422710751 +0000 UTC m=+0.120206060 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb  2 04:49:58 np0005604791 python3.9[137687]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025797.4807904-370-194460599842836/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:49:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:58 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:49:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:49:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:59.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:49:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:49:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:49:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:59.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:49:59 np0005604791 python3.9[137851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:00 np0005604791 python3.9[137974]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025799.2823985-502-265633110977756/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:50:00 np0005604791 ceph-mon[80115]: Health detail: HEALTH_WARN 3 failed cephadm daemon(s)
Feb  2 04:50:00 np0005604791 ceph-mon[80115]: [WRN] CEPHADM_FAILED_DAEMON: 3 failed cephadm daemon(s)
Feb  2 04:50:00 np0005604791 ceph-mon[80115]:    daemon nfs.cephfs.2.0.compute-0.fdwwab on compute-0 is in unknown state
Feb  2 04:50:00 np0005604791 ceph-mon[80115]:    daemon nfs.cephfs.0.0.compute-1.mhzhsx on compute-1 is in unknown state
Feb  2 04:50:00 np0005604791 ceph-mon[80115]:    daemon nfs.cephfs.1.0.compute-2.dciyfa on compute-2 is in unknown state
Feb  2 04:50:00 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:00 np0005604791 python3.9[138124]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:00 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:01 np0005604791 python3.9[138246]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025800.388307-502-11700354517451/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:50:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:50:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:01.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:50:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:01.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:02 np0005604791 python3.9[138398]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:50:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:02 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:02 np0005604791 python3.9[138553]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:50:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:03.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:03 np0005604791 python3.9[138707]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:03.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:04 np0005604791 python3.9[138785]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:50:04 np0005604791 python3.9[138939]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:04 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:05 np0005604791 python3.9[139018]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:50:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:05.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:05 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:05.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:05 np0005604791 python3.9[139170]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:06 np0005604791 python3.9[139322]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:06 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:07 np0005604791 python3.9[139401]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:07.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:50:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:07.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:50:07 np0005604791 python3.9[139558]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:08 np0005604791 python3.9[139636]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:08 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:09 np0005604791 python3.9[139791]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:50:09 np0005604791 systemd[1]: Reloading.
Feb  2 04:50:09 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:50:09 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:50:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:09.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:09.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:10 np0005604791 python3.9[139983]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095010 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:50:10 np0005604791 python3.9[140061]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:10 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:10 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db00010d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:11 np0005604791 python3.9[140214]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:11.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:11.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:11 np0005604791 python3.9[140294]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:12 np0005604791 python3.9[140446]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:50:12 np0005604791 systemd[1]: Reloading.
Feb  2 04:50:12 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:50:12 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:50:12 np0005604791 systemd[1]: Starting Create netns directory...
Feb  2 04:50:12 np0005604791 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb  2 04:50:12 np0005604791 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb  2 04:50:12 np0005604791 systemd[1]: Finished Create netns directory.
Feb  2 04:50:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0001e80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:13.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:13.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:14 np0005604791 python3.9[140640]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:50:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:14 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:15 np0005604791 python3.9[140818]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:15.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:15 np0005604791 python3.9[140941]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025814.613436-955-95466646659596/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:50:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:15.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:16 np0005604791 python3.9[141095]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:16 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:17 np0005604791 python3.9[141250]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:50:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:17.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:50:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:17.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:50:17 np0005604791 python3.9[141402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:50:18 np0005604791 python3.9[141525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025817.613887-1054-149016092379195/.source.json _original_basename=._0d2hgfi follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095018 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:50:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:18 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db00029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:19 np0005604791 python3.9[141678]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:19.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:19.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:20 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:20 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:50:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:20 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:50:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db00029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:21.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:21 np0005604791 python3.9[142104]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb  2 04:50:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:21.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:22 np0005604791 python3.9[142258]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb  2 04:50:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:22 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:50:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:23.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:50:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:23.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:24 np0005604791 python3[142413]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb  2 04:50:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:24 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:50:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:24 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:50:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:24 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:25.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:50:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:25.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:50:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:26 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:27.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:27.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:50:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:50:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:28 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:29.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:29.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:50:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:50:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:31 np0005604791 podman[142503]: 2026-02-02 09:50:31.535776443 +0000 UTC m=+2.769583886 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb  2 04:50:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:31.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:31.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:32 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:33.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:50:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:50:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:33.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:50:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:50:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095034 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:50:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:34 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:35.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:35.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:36 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:50:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:37 np0005604791 podman[142427]: 2026-02-02 09:50:37.315236134 +0000 UTC m=+13.236378666 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb  2 04:50:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:37.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:37 np0005604791 podman[142632]: 2026-02-02 09:50:37.477740864 +0000 UTC m=+0.036223866 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb  2 04:50:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:37.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:37 np0005604791 podman[142632]: 2026-02-02 09:50:37.891633644 +0000 UTC m=+0.450116596 container create 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb  2 04:50:37 np0005604791 python3[142413]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb  2 04:50:38 np0005604791 python3.9[142823]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:50:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:38 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:39.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:39 np0005604791 python3.9[143052]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:39.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:40 np0005604791 python3.9[143140]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:50:40 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:50:40 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:50:40 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:50:40 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:50:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095040 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:50:40 np0005604791 python3.9[143291]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770025840.1800416-1288-191941218005665/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:40 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:41 np0005604791 python3.9[143368]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb  2 04:50:41 np0005604791 systemd[1]: Reloading.
Feb  2 04:50:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:41 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:50:41 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:50:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:41.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:41.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:42 np0005604791 python3.9[143479]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:50:42 np0005604791 systemd[1]: Reloading.
Feb  2 04:50:42 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:50:42 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:50:42 np0005604791 systemd[1]: Starting ovn_metadata_agent container...
Feb  2 04:50:42 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:50:42 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f162e3e4e4516f70737a87aceaba248606c2e295597c4fa34890d6a1f85ad4a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb  2 04:50:42 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f162e3e4e4516f70737a87aceaba248606c2e295597c4fa34890d6a1f85ad4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb  2 04:50:42 np0005604791 systemd[1]: Started /usr/bin/podman healthcheck run 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d.
Feb  2 04:50:42 np0005604791 podman[143520]: 2026-02-02 09:50:42.877798747 +0000 UTC m=+0.287535974 container init 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: + sudo -E kolla_set_configs
Feb  2 04:50:42 np0005604791 podman[143520]: 2026-02-02 09:50:42.910981631 +0000 UTC m=+0.320718818 container start 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb  2 04:50:42 np0005604791 edpm-start-podman-container[143520]: ovn_metadata_agent
Feb  2 04:50:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:42 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Validating config file
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Copying service configuration files
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Writing out command to execute
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: ++ cat /run_command
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: + CMD=neutron-ovn-metadata-agent
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: + ARGS=
Feb  2 04:50:42 np0005604791 ovn_metadata_agent[143537]: + sudo kolla_copy_cacerts
Feb  2 04:50:43 np0005604791 ovn_metadata_agent[143537]: + [[ ! -n '' ]]
Feb  2 04:50:43 np0005604791 ovn_metadata_agent[143537]: + . kolla_extend_start
Feb  2 04:50:43 np0005604791 ovn_metadata_agent[143537]: Running command: 'neutron-ovn-metadata-agent'
Feb  2 04:50:43 np0005604791 ovn_metadata_agent[143537]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb  2 04:50:43 np0005604791 ovn_metadata_agent[143537]: + umask 0022
Feb  2 04:50:43 np0005604791 ovn_metadata_agent[143537]: + exec neutron-ovn-metadata-agent
Feb  2 04:50:43 np0005604791 podman[143544]: 2026-02-02 09:50:43.01755582 +0000 UTC m=+0.096332477 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb  2 04:50:43 np0005604791 edpm-start-podman-container[143519]: Creating additional drop-in dependency for "ovn_metadata_agent" (9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d)
Feb  2 04:50:43 np0005604791 systemd[1]: Reloading.
Feb  2 04:50:43 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:50:43 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:50:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:43 np0005604791 systemd[1]: Started ovn_metadata_agent container.
Feb  2 04:50:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:50:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:43.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:50:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:43.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.856 143542 INFO neutron.common.config [-] Logging enabled!#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.856 143542 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.856 143542 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.888 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.895 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.895 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.895 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.896 143542 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.896 143542 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.908 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2f54a3b0-231a-4b96-9e3a-0a36e3e73216 (UUID: 2f54a3b0-231a-4b96-9e3a-0a36e3e73216) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.929 143542 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.930 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.930 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.930 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.932 143542 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.937 143542 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.943 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2f54a3b0-231a-4b96-9e3a-0a36e3e73216'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], external_ids={}, name=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, nb_cfg_timestamp=1770025776644, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.943 143542 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f9094eeff70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.944 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.944 143542 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.944 143542 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.945 143542 INFO oslo_service.service [-] Starting 1 workers#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.947 143542 DEBUG oslo_service.service [-] Started child 143784 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.950 143542 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpbue2c5ic/privsep.sock']#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.950 143784 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-2066399'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Feb  2 04:50:44 np0005604791 python3.9[143783]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.974 143784 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.974 143784 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.975 143784 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.977 143784 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Feb  2 04:50:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:44 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.983 143784 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Feb  2 04:50:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.988 143784 INFO eventlet.wsgi.server [-] (143784) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Feb  2 04:50:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:45 np0005604791 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb  2 04:50:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:45.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:45 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.591 143542 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Feb  2 04:50:45 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.592 143542 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbue2c5ic/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Feb  2 04:50:45 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.445 143813 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Feb  2 04:50:45 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.452 143813 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Feb  2 04:50:45 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.455 143813 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Feb  2 04:50:45 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.456 143813 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143813#033[00m
Feb  2 04:50:45 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.596 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[2de41c97-5f96-4b14-be02-f1c369ad3d48]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 04:50:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:45.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.085 143813 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.085 143813 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.085 143813 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 04:50:46 np0005604791 python3.9[143947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:50:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.550 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5bba41-3b30-4518-ad6b-b4b68eef18ba]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.552 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, column=external_ids, values=({'neutron:ovn-metadata-id': 'a75d91b6-054c-5910-b05b-ab1c1fe068f6'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.570 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.626 143542 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.626 143542 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.627 143542 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:50:46 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.627 143542 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Feb  2 04:50:46 np0005604791 python3.9[144072]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025845.621254-1423-185411318382995/.source.yaml _original_basename=.gtyzrjnz follow=False checksum=f6b794fee8fdd156223951721cad4bcef298320f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:50:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:46 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:47 np0005604791 systemd[1]: session-51.scope: Deactivated successfully.
Feb  2 04:50:47 np0005604791 systemd[1]: session-51.scope: Consumed 50.394s CPU time.
Feb  2 04:50:47 np0005604791 systemd-logind[805]: Session 51 logged out. Waiting for processes to exit.
Feb  2 04:50:47 np0005604791 systemd-logind[805]: Removed session 51.
Feb  2 04:50:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:47 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:50:47 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:50:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:47.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:47.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:48 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:49.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:49.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:50 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:51.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:51.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:52 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:53 np0005604791 systemd-logind[805]: New session 52 of user zuul.
Feb  2 04:50:53 np0005604791 systemd[1]: Started Session 52 of User zuul.
Feb  2 04:50:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:53.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:53.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:54 np0005604791 python3.9[144290]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:50:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:54 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:55 np0005604791 python3.9[144474]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:50:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:55.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:55.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:56 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:50:56 np0005604791 python3.9[144641]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb  2 04:50:56 np0005604791 systemd[1]: Reloading.
Feb  2 04:50:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:56 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:57 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:50:57 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:50:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:50:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:57.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:50:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:57.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:58 np0005604791 python3.9[144831]: ansible-ansible.builtin.service_facts Invoked
Feb  2 04:50:58 np0005604791 network[144848]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb  2 04:50:58 np0005604791 network[144849]: 'network-scripts' will be removed from distribution in near future.
Feb  2 04:50:58 np0005604791 network[144850]: It is advised to switch to 'NetworkManager' instead for network management.
Feb  2 04:50:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:58 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:50:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:59.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:50:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:50:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:50:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:59.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095100 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:51:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:00 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:01 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:01.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:01 np0005604791 python3.9[145116]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:51:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:01.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:02 np0005604791 python3.9[145269]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:51:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:03 np0005604791 python3.9[145425]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:51:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:03.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:03.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:03 np0005604791 podman[145550]: 2026-02-02 09:51:03.874381129 +0000 UTC m=+0.129411330 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 04:51:04 np0005604791 python3.9[145594]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:51:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:05 np0005604791 python3.9[145762]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:51:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:05.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:05.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:05 np0005604791 python3.9[145917]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:51:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:06 np0005604791 python3.9[146070]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:51:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:07.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:07.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:07 np0005604791 python3.9[146226]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:08 np0005604791 python3.9[146378]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:09 np0005604791 python3.9[146531]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:09.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:09.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:09 np0005604791 python3.9[146685]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:10 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:51:10 np0005604791 python3.9[146837]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:11 np0005604791 python3.9[146992]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:11.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:11 np0005604791 python3.9[147146]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:11.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095112 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:51:12 np0005604791 python3.9[147300]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:51:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:51:13 np0005604791 podman[147453]: 2026-02-02 09:51:13.136908298 +0000 UTC m=+0.065718343 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:51:13 np0005604791 python3.9[147454]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:13.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:13 np0005604791 python3.9[147628]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:13.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:14 np0005604791 python3.9[147780]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:15 np0005604791 python3.9[147958]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:15.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:15 np0005604791 python3.9[148112]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:15.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:16 np0005604791 python3.9[148264]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:51:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:51:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:51:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:17 np0005604791 python3.9[148419]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:51:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:17.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:17.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:18 np0005604791 python3.9[148571]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb  2 04:51:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:19 np0005604791 python3.9[148726]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb  2 04:51:19 np0005604791 systemd[1]: Reloading.
Feb  2 04:51:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:19 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:51:19 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:51:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:19.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:19.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:20 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:51:20 np0005604791 python3.9[148915]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:51:20 np0005604791 python3.9[149069]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:51:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:21 np0005604791 python3.9[149224]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:51:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:21.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:21.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:22 np0005604791 python3.9[149377]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:51:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:51:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:23.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:23.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:23 np0005604791 python3.9[149533]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:51:24 np0005604791 python3.9[149688]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:51:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:25 np0005604791 python3.9[149842]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:51:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:25.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:25.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:26 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:51:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:26 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:51:26 np0005604791 python3.9[149997]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb  2 04:51:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095126 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:51:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:27 np0005604791 python3.9[150153]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb  2 04:51:27 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:51:27 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:51:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:27.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:27.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:28 np0005604791 python3.9[150314]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb  2 04:51:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:51:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:29.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:29 np0005604791 python3.9[150475]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:51:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:29.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:30 np0005604791 python3.9[150561]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:51:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:31 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:31.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:31.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095132 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:51:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:33.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:33.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:34 np0005604791 podman[150579]: 2026-02-02 09:51:34.462390325 +0000 UTC m=+0.120311581 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb  2 04:51:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:35.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:35.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:37.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:37.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:39.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:39.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:41.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:41.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:43 np0005604791 podman[150821]: 2026-02-02 09:51:43.39316664 +0000 UTC m=+0.064096281 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb  2 04:51:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:43.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:43.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:51:44.889 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 04:51:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:51:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 04:51:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:51:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 04:51:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:45.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:45.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:47.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:47.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:47 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:51:47 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:51:47 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:51:47 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:51:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095148 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:51:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:49.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:49.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:51.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:51.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095152 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:51:52 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:51:52 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:51:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:53 np0005604791 kernel: SELinux:  Converting 2782 SID table entries...
Feb  2 04:51:53 np0005604791 kernel: SELinux:  policy capability network_peer_controls=1
Feb  2 04:51:53 np0005604791 kernel: SELinux:  policy capability open_perms=1
Feb  2 04:51:53 np0005604791 kernel: SELinux:  policy capability extended_socket_class=1
Feb  2 04:51:53 np0005604791 kernel: SELinux:  policy capability always_check_network=0
Feb  2 04:51:53 np0005604791 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb  2 04:51:53 np0005604791 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb  2 04:51:53 np0005604791 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb  2 04:51:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:53.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:53.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:54 np0005604791 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Feb  2 04:51:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:55.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:55.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:56 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:51:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:51:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:57.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:51:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:51:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:57.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:51:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:51:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:51:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:59.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:51:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:51:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:51:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:59.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:00 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:52:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:00 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:52:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:01 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:52:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:01.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:52:01 np0005604791 kernel: SELinux:  Converting 2782 SID table entries...
Feb  2 04:52:01 np0005604791 kernel: SELinux:  policy capability network_peer_controls=1
Feb  2 04:52:01 np0005604791 kernel: SELinux:  policy capability open_perms=1
Feb  2 04:52:01 np0005604791 kernel: SELinux:  policy capability extended_socket_class=1
Feb  2 04:52:01 np0005604791 kernel: SELinux:  policy capability always_check_network=0
Feb  2 04:52:01 np0005604791 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb  2 04:52:01 np0005604791 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb  2 04:52:01 np0005604791 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb  2 04:52:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:01.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:52:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:03.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:03.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:05 np0005604791 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb  2 04:52:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:05 np0005604791 podman[151035]: 2026-02-02 09:52:05.422182048 +0000 UTC m=+0.087817959 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:52:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:05.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:05.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:06 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:52:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:07.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:07.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095208 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:52:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:52:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:52:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:09.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:09.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:11.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:11.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:52:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:52:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:13.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:52:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:13.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:14 np0005604791 podman[152706]: 2026-02-02 09:52:14.374048612 +0000 UTC m=+0.049265997 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb  2 04:52:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:15.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:15.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095216 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:52:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:17.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:52:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:17.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:52:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb  2 04:52:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:19.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb  2 04:52:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:19.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:21.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:21.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:23.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:23.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:25.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:25.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095226 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:52:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:27.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:27.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:29.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:29.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:31 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:31.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:31.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:33.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:33.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:52:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:35.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:52:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:35.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:36 np0005604791 podman[168041]: 2026-02-02 09:52:36.42145437 +0000 UTC m=+0.088953977 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb  2 04:52:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:36 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:52:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:37.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:37.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:39.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:52:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:39.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:52:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:52:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:52:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:52:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:41.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:52:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:41.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:52:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:52:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:43.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:52:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:52:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:43.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:52:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:52:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 04:52:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:52:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 04:52:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:52:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 04:52:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:45 np0005604791 podman[168103]: 2026-02-02 09:52:45.392893239 +0000 UTC m=+0.068672977 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb  2 04:52:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:45.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:52:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:45.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:52:46 np0005604791 kernel: SELinux:  Converting 2783 SID table entries...
Feb  2 04:52:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:46 np0005604791 kernel: SELinux:  policy capability network_peer_controls=1
Feb  2 04:52:46 np0005604791 kernel: SELinux:  policy capability open_perms=1
Feb  2 04:52:46 np0005604791 kernel: SELinux:  policy capability extended_socket_class=1
Feb  2 04:52:46 np0005604791 kernel: SELinux:  policy capability always_check_network=0
Feb  2 04:52:46 np0005604791 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb  2 04:52:46 np0005604791 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb  2 04:52:46 np0005604791 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb  2 04:52:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:47 np0005604791 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb  2 04:52:47 np0005604791 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb  2 04:52:47 np0005604791 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb  2 04:52:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:47.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:52:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:47.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:52:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095248 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:52:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:49.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:49.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:51.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:51.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:53 np0005604791 podman[168624]: 2026-02-02 09:52:53.40935939 +0000 UTC m=+0.072620152 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Feb  2 04:52:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:53 np0005604791 podman[168624]: 2026-02-02 09:52:53.553130254 +0000 UTC m=+0.216391006 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb  2 04:52:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:53.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:53.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:53 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb  2 04:52:54 np0005604791 podman[169252]: 2026-02-02 09:52:54.142302234 +0000 UTC m=+0.071739789 container exec 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 04:52:54 np0005604791 podman[169252]: 2026-02-02 09:52:54.154710334 +0000 UTC m=+0.084147829 container exec_died 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 04:52:54 np0005604791 systemd[1]: Stopping OpenSSH server daemon...
Feb  2 04:52:54 np0005604791 systemd[1]: sshd.service: Deactivated successfully.
Feb  2 04:52:54 np0005604791 systemd[1]: Stopped OpenSSH server daemon.
Feb  2 04:52:54 np0005604791 systemd[1]: sshd.service: Consumed 13.475s CPU time, read 32.0K from disk, written 32.0K to disk.
Feb  2 04:52:54 np0005604791 systemd[1]: Stopped target sshd-keygen.target.
Feb  2 04:52:54 np0005604791 systemd[1]: Stopping sshd-keygen.target...
Feb  2 04:52:54 np0005604791 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb  2 04:52:54 np0005604791 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb  2 04:52:54 np0005604791 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb  2 04:52:54 np0005604791 systemd[1]: Reached target sshd-keygen.target.
Feb  2 04:52:54 np0005604791 systemd[1]: Starting OpenSSH server daemon...
Feb  2 04:52:54 np0005604791 systemd[1]: Started OpenSSH server daemon.
Feb  2 04:52:54 np0005604791 podman[169353]: 2026-02-02 09:52:54.427198932 +0000 UTC m=+0.072293424 container exec 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Feb  2 04:52:54 np0005604791 podman[169353]: 2026-02-02 09:52:54.45158612 +0000 UTC m=+0.096680612 container exec_died 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:52:54 np0005604791 podman[169446]: 2026-02-02 09:52:54.706535001 +0000 UTC m=+0.067593569 container exec 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 04:52:54 np0005604791 podman[169446]: 2026-02-02 09:52:54.73957335 +0000 UTC m=+0.100631848 container exec_died 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 04:52:54 np0005604791 podman[169544]: 2026-02-02 09:52:54.965759986 +0000 UTC m=+0.054217863 container exec 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, description=keepalived for Ceph, vcs-type=git)
Feb  2 04:52:55 np0005604791 podman[169544]: 2026-02-02 09:52:55.0226825 +0000 UTC m=+0.111140327 container exec_died 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, distribution-scope=public, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, release=1793, io.buildah.version=1.28.2)
Feb  2 04:52:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:55.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:55.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:56 np0005604791 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb  2 04:52:56 np0005604791 systemd[1]: Starting man-db-cache-update.service...
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:52:56 np0005604791 systemd[1]: Reloading.
Feb  2 04:52:56 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:52:56 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:52:56 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:52:56 np0005604791 systemd[1]: Queuing reload/restart jobs for marked units…
Feb  2 04:52:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:57 np0005604791 ceph-mon[80115]: Health check update: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb  2 04:52:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:57.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:57.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:52:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:59.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:52:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:52:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:52:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:59.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:01 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:53:01 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:53:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:01 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:01.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:01.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:03 np0005604791 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb  2 04:53:03 np0005604791 systemd[1]: Finished man-db-cache-update.service.
Feb  2 04:53:03 np0005604791 systemd[1]: man-db-cache-update.service: Consumed 8.869s CPU time.
Feb  2 04:53:03 np0005604791 systemd[1]: run-rb09085d442244dd7bc625d76dab9b4fc.service: Deactivated successfully.
Feb  2 04:53:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:53:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:03.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:53:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:03.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:05 np0005604791 auditd[706]: Audit daemon rotating log files
Feb  2 04:53:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:05.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:05.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:06 np0005604791 podman[178326]: 2026-02-02 09:53:06.835590033 +0000 UTC m=+0.151707296 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb  2 04:53:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:07.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:07.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:08 np0005604791 python3.9[178482]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb  2 04:53:08 np0005604791 systemd[1]: Reloading.
Feb  2 04:53:08 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:53:08 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:53:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:09 np0005604791 python3.9[178676]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb  2 04:53:09 np0005604791 systemd[1]: Reloading.
Feb  2 04:53:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:09.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:09 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:53:09 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:53:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:09.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:10 np0005604791 python3.9[178866]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb  2 04:53:10 np0005604791 systemd[1]: Reloading.
Feb  2 04:53:10 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:53:10 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:53:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:11 np0005604791 python3.9[179060]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb  2 04:53:11 np0005604791 systemd[1]: Reloading.
Feb  2 04:53:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:11.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:11 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:53:11 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:53:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:11.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:13 np0005604791 python3.9[179252]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:13 np0005604791 systemd[1]: Reloading.
Feb  2 04:53:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:13 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:53:13 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:53:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:53:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:13.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:53:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:53:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:13.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:53:14 np0005604791 python3.9[179444]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:14 np0005604791 systemd[1]: Reloading.
Feb  2 04:53:14 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:53:14 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:53:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:15 np0005604791 python3.9[179637]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:15 np0005604791 systemd[1]: Reloading.
Feb  2 04:53:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:15 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:53:15 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:53:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:15 np0005604791 podman[179700]: 2026-02-02 09:53:15.70952055 +0000 UTC m=+0.075979991 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb  2 04:53:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:53:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:15.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:53:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:15.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:16 np0005604791 python3.9[179870]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:17 np0005604791 python3.9[180028]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:17 np0005604791 systemd[1]: Reloading.
Feb  2 04:53:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:17 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:53:17 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:53:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:17.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:17.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:19 np0005604791 python3.9[180220]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb  2 04:53:19 np0005604791 systemd[1]: Reloading.
Feb  2 04:53:19 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:53:19 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:53:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:19 np0005604791 systemd[1]: Listening on libvirt proxy daemon socket.
Feb  2 04:53:19 np0005604791 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Feb  2 04:53:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:19.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:19.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:21 np0005604791 python3.9[180418]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:53:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:21.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:53:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:21.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:22 np0005604791 python3.9[180573]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:23 np0005604791 python3.9[180731]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:53:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:23.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:53:23 np0005604791 python3.9[180887]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:23.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:24 np0005604791 python3.9[181044]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:25 np0005604791 python3.9[181200]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:25.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:25.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:26 np0005604791 python3.9[181357]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:27 np0005604791 python3.9[181515]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:27.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:27.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:27 np0005604791 python3.9[181670]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:28 np0005604791 python3.9[181827]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:29 np0005604791 python3.9[181983]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:53:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:29.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:53:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:29.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:30 np0005604791 python3.9[182140]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:31 np0005604791 python3.9[182296]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy ignored for local
Feb  2 04:53:31 np0005604791 kernel: ganesha.nfsd[151860]: segfault at 50 ip 00007f4e616b332e sp 00007f4dd77fd210 error 4 in libntirpc.so.5.8[7f4e61698000+2c000] likely on CPU 7 (core 0, socket 7)
Feb  2 04:53:31 np0005604791 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb  2 04:53:31 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:31 np0005604791 systemd[1]: Started Process Core Dump (PID 182377/UID 0).
Feb  2 04:53:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:53:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:31.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:53:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:31.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:31 np0005604791 python3.9[182455]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb  2 04:53:32 np0005604791 systemd-coredump[182383]: Process 122314 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 74:#012#0  0x00007f4e616b332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Feb  2 04:53:32 np0005604791 systemd[1]: systemd-coredump@5-182377-0.service: Deactivated successfully.
Feb  2 04:53:32 np0005604791 podman[182487]: 2026-02-02 09:53:32.57527812 +0000 UTC m=+0.042176191 container died 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 04:53:32 np0005604791 systemd[1]: var-lib-containers-storage-overlay-8524dd3387d19fcf089347383faa64c8e2290cf613dceca79426c8e374e209c0-merged.mount: Deactivated successfully.
Feb  2 04:53:32 np0005604791 podman[182487]: 2026-02-02 09:53:32.620558139 +0000 UTC m=+0.087456220 container remove 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Feb  2 04:53:32 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb  2 04:53:32 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 04:53:32 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.755s CPU time.
Feb  2 04:53:33 np0005604791 python3.9[182661]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:53:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:53:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:33.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:53:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:33.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:34 np0005604791 python3.9[182813]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:53:34 np0005604791 python3.9[182967]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:53:35 np0005604791 python3.9[183120]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:53:35 np0005604791 python3.9[183273]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:53:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:53:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:35.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:53:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:35.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:36 np0005604791 python3.9[183451]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:53:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:37 np0005604791 podman[183578]: 2026-02-02 09:53:37.173288406 +0000 UTC m=+0.110173206 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb  2 04:53:37 np0005604791 python3.9[183613]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:53:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095337 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:53:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:37.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:37.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:38 np0005604791 python3.9[183782]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:53:39 np0005604791 python3.9[183910]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026017.6431115-1642-205544022524164/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095339 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:53:39 np0005604791 python3.9[184062]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:53:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:39.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:39.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:40 np0005604791 python3.9[184189]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026019.2232573-1642-70406926936368/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:40 np0005604791 python3.9[184342]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:53:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:41 np0005604791 python3.9[184467]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026020.4817348-1642-249894984223714/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:41.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:41.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:42 np0005604791 python3.9[184621]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:53:42 np0005604791 python3.9[184746]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026021.6501741-1642-99336805560282/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:42 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 6.
Feb  2 04:53:42 np0005604791 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:53:42 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.755s CPU time.
Feb  2 04:53:42 np0005604791 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:53:43 np0005604791 podman[184897]: 2026-02-02 09:53:43.059357919 +0000 UTC m=+0.054285489 container create 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Feb  2 04:53:43 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b41398c763b3c102a46779e12a2f4cfcf9f278ef37c2c2c2f473f0bbe2f41a2c/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb  2 04:53:43 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b41398c763b3c102a46779e12a2f4cfcf9f278ef37c2c2c2f473f0bbe2f41a2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:53:43 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b41398c763b3c102a46779e12a2f4cfcf9f278ef37c2c2c2f473f0bbe2f41a2c/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:53:43 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b41398c763b3c102a46779e12a2f4cfcf9f278ef37c2c2c2f473f0bbe2f41a2c/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:53:43 np0005604791 podman[184897]: 2026-02-02 09:53:43.038489549 +0000 UTC m=+0.033417149 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:53:43 np0005604791 podman[184897]: 2026-02-02 09:53:43.134069474 +0000 UTC m=+0.128997084 container init 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 04:53:43 np0005604791 podman[184897]: 2026-02-02 09:53:43.14217935 +0000 UTC m=+0.137106930 container start 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Feb  2 04:53:43 np0005604791 bash[184897]: 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7
Feb  2 04:53:43 np0005604791 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:53:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb  2 04:53:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb  2 04:53:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb  2 04:53:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb  2 04:53:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb  2 04:53:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb  2 04:53:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb  2 04:53:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:53:43 np0005604791 python3.9[185007]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:53:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:53:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:43.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:53:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:43.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:44 np0005604791 python3.9[185132]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026022.7632465-1642-51387862359910/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:44 np0005604791 python3.9[185286]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:53:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:53:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 04:53:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:53:44.892 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 04:53:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:53:44.893 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 04:53:45 np0005604791 python3.9[185412]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026024.2034993-1642-18020966382232/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:45 np0005604791 python3.9[185564]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:53:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:53:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:45.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:53:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:45.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:46 np0005604791 podman[185637]: 2026-02-02 09:53:46.387336425 +0000 UTC m=+0.064521028 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb  2 04:53:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:46 np0005604791 python3.9[185708]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026025.3671057-1642-42392861843968/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:47 np0005604791 python3.9[185861]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:53:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:53:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:47.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:53:47 np0005604791 python3.9[185988]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026026.9117117-1642-36548457365212/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:47.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:49 np0005604791 python3.9[186143]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Feb  2 04:53:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:53:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:53:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:53:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:49.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:53:49 np0005604791 python3.9[186296]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:49.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:50 np0005604791 python3.9[186450]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:51 np0005604791 python3.9[186603]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:53:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:51.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:53:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:51.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:51 np0005604791 python3.9[186757]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:52 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Feb  2 04:53:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:52 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:53:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:52 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:53:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:52 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:53:52 np0005604791 python3.9[186909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:53 np0005604791 python3.9[187064]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:53 np0005604791 python3.9[187216]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:53.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:53.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:54 np0005604791 python3.9[187370]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:55 np0005604791 python3.9[187523]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb620000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6140016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:55.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:53:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:55.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:53:56 np0005604791 python3.9[187695]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:56 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:53:56 np0005604791 python3.9[187869]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:57 np0005604791 python3.9[188024]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095357 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:53:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:53:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:57.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:53:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:53:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:57.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:53:58 np0005604791 python3.9[188176]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:58 np0005604791 python3.9[188328]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:53:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095359 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:53:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:53:59 np0005604791 python3.9[188483]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:53:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:53:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:53:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:59.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:59.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:00 np0005604791 python3.9[188606]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026039.1509202-2305-190010660507789/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:00 np0005604791 python3.9[188760]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:01 np0005604791 python3.9[188884]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026040.2868788-2305-149784564914833/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:01 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:01.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:02 np0005604791 python3.9[189102]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:02.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:54:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:54:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:54:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:54:02 np0005604791 python3.9[189244]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026041.5640469-2305-110359189699812/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:03 np0005604791 python3.9[189399]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:03 np0005604791 python3.9[189522]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026042.6770968-2305-174616988064089/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:03.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:04.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:04 np0005604791 python3.9[189674]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:04 np0005604791 python3.9[189797]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026043.8491817-2305-235956656393227/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:05 np0005604791 python3.9[189950]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:05.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:06.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:06 np0005604791 python3.9[190075]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026044.985122-2305-238934063252152/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:06 np0005604791 python3.9[190227]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:07 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:54:07 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:54:07 np0005604791 python3.9[190378]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026046.244873-2305-185985929790888/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:07 np0005604791 podman[190379]: 2026-02-02 09:54:07.435763389 +0000 UTC m=+0.099723480 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Feb  2 04:54:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:07.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:07 np0005604791 python3.9[190558]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:08.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095408 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:54:08 np0005604791 python3.9[190681]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026047.4871-2305-277857923965015/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:09 np0005604791 python3.9[190834]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:09 np0005604791 python3.9[190959]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026048.6744466-2305-242883492080263/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:09.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:10.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:10 np0005604791 python3.9[191111]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:10 np0005604791 python3.9[191235]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026049.8597515-2305-77005213568255/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095411 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:54:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:11 np0005604791 python3.9[191389]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:11.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:12.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:12 np0005604791 python3.9[191514]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026051.1267045-2305-252374458956224/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:13 np0005604791 python3.9[191667]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:13 np0005604791 python3.9[191790]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026052.9545631-2305-277997031214432/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:13.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:14.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:14 np0005604791 python3.9[191944]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:15 np0005604791 python3.9[192068]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026054.1463447-2305-195197278954890/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:15 np0005604791 python3.9[192222]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:15.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:16.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:16 np0005604791 python3.9[192370]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026055.3469117-2305-211702492162504/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:17 np0005604791 podman[192497]: 2026-02-02 09:54:17.054751432 +0000 UTC m=+0.076840630 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb  2 04:54:17 np0005604791 python3.9[192534]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:54:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:17.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:18.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:18 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:54:18 np0005604791 python3.9[192697]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb  2 04:54:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:19.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:20 np0005604791 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb  2 04:54:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:20.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:20 np0005604791 python3.9[192856]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:20 np0005604791 python3.9[193009]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:54:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:54:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:54:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:21 np0005604791 python3.9[193163]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:21.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:22.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:22 np0005604791 python3.9[193317]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:22 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:54:23 np0005604791 python3.9[193470]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:23.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:24.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:24 np0005604791 python3.9[193624]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:24 np0005604791 python3.9[193778]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:25 np0005604791 python3.9[193931]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:54:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:25.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:26.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:26 np0005604791 python3.9[194083]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:26 np0005604791 python3.9[194237]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:27 np0005604791 python3.9[194392]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:54:27 np0005604791 systemd[1]: Reloading.
Feb  2 04:54:27 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:54:27 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:54:27 np0005604791 systemd[1]: Starting libvirt logging daemon socket...
Feb  2 04:54:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:27.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:27 np0005604791 systemd[1]: Listening on libvirt logging daemon socket.
Feb  2 04:54:27 np0005604791 systemd[1]: Starting libvirt logging daemon admin socket...
Feb  2 04:54:27 np0005604791 systemd[1]: Listening on libvirt logging daemon admin socket.
Feb  2 04:54:27 np0005604791 systemd[1]: Starting libvirt logging daemon...
Feb  2 04:54:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:28.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:28 np0005604791 systemd[1]: Started libvirt logging daemon.
Feb  2 04:54:28 np0005604791 python3.9[194587]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:54:28 np0005604791 systemd[1]: Reloading.
Feb  2 04:54:29 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:54:29 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:54:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:29 np0005604791 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb  2 04:54:29 np0005604791 systemd[1]: Starting libvirt nodedev daemon socket...
Feb  2 04:54:29 np0005604791 systemd[1]: Listening on libvirt nodedev daemon socket.
Feb  2 04:54:29 np0005604791 systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb  2 04:54:29 np0005604791 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb  2 04:54:29 np0005604791 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb  2 04:54:29 np0005604791 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb  2 04:54:29 np0005604791 systemd[1]: Starting libvirt nodedev daemon...
Feb  2 04:54:29 np0005604791 systemd[1]: Started libvirt nodedev daemon.
Feb  2 04:54:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8000d00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f0000d00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:29 np0005604791 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb  2 04:54:29 np0005604791 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb  2 04:54:29 np0005604791 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb  2 04:54:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:29.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:30.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:30 np0005604791 python3.9[194814]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:54:30 np0005604791 systemd[1]: Reloading.
Feb  2 04:54:30 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:54:30 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:54:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095430 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:54:30 np0005604791 systemd[1]: Starting libvirt proxy daemon admin socket...
Feb  2 04:54:30 np0005604791 systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb  2 04:54:30 np0005604791 systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb  2 04:54:30 np0005604791 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb  2 04:54:30 np0005604791 systemd[1]: Starting libvirt proxy daemon...
Feb  2 04:54:30 np0005604791 systemd[1]: Started libvirt proxy daemon.
Feb  2 04:54:30 np0005604791 setroubleshoot[194627]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 6c0cf167-02db-439b-80c9-cd599f4d6f27
Feb  2 04:54:30 np0005604791 setroubleshoot[194627]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Feb  2 04:54:30 np0005604791 setroubleshoot[194627]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 6c0cf167-02db-439b-80c9-cd599f4d6f27
Feb  2 04:54:30 np0005604791 setroubleshoot[194627]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Feb  2 04:54:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095431 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:54:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:31 np0005604791 python3.9[195030]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:54:31 np0005604791 systemd[1]: Reloading.
Feb  2 04:54:31 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:54:31 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:54:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:31 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:31 np0005604791 systemd[1]: Listening on libvirt locking daemon socket.
Feb  2 04:54:31 np0005604791 systemd[1]: Starting libvirt QEMU daemon socket...
Feb  2 04:54:31 np0005604791 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb  2 04:54:31 np0005604791 systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb  2 04:54:31 np0005604791 systemd[1]: Listening on libvirt QEMU daemon socket.
Feb  2 04:54:31 np0005604791 systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb  2 04:54:31 np0005604791 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb  2 04:54:31 np0005604791 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb  2 04:54:31 np0005604791 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb  2 04:54:31 np0005604791 systemd[1]: Started Virtual Machine and Container Registration Service.
Feb  2 04:54:31 np0005604791 systemd[1]: Starting libvirt QEMU daemon...
Feb  2 04:54:31 np0005604791 systemd[1]: Started libvirt QEMU daemon.
Feb  2 04:54:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:54:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:31.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:54:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:32.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:32 np0005604791 python3.9[195247]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:54:32 np0005604791 systemd[1]: Reloading.
Feb  2 04:54:32 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:54:32 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:54:32 np0005604791 systemd[1]: Starting libvirt secret daemon socket...
Feb  2 04:54:32 np0005604791 systemd[1]: Listening on libvirt secret daemon socket.
Feb  2 04:54:32 np0005604791 systemd[1]: Starting libvirt secret daemon admin socket...
Feb  2 04:54:32 np0005604791 systemd[1]: Starting libvirt secret daemon read-only socket...
Feb  2 04:54:32 np0005604791 systemd[1]: Listening on libvirt secret daemon admin socket.
Feb  2 04:54:32 np0005604791 systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb  2 04:54:32 np0005604791 systemd[1]: Starting libvirt secret daemon...
Feb  2 04:54:32 np0005604791 systemd[1]: Started libvirt secret daemon.
Feb  2 04:54:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f0001840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:33 np0005604791 python3.9[195462]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:33.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:34.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:34 np0005604791 python3.9[195616]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb  2 04:54:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:35 np0005604791 python3.9[195769]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:54:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f0001840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:35 np0005604791 python3.9[195923]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb  2 04:54:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:35.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:54:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:36.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:54:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:36 np0005604791 python3.9[196102]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:37 np0005604791 python3.9[196223]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026076.4173043-3379-269828669694851/.source.xml follow=False _original_basename=secret.xml.j2 checksum=19e72152fe151d80bf9ff9b6a78f27bac75d38a6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:37 np0005604791 podman[196224]: 2026-02-02 09:54:37.65324014 +0000 UTC m=+0.130560797 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb  2 04:54:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:37.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:38.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:38 np0005604791 python3.9[196403]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine d241d473-9fcb-5f74-b163-f1ca4454e7f1#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:54:39 np0005604791 python3.9[196566]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb620002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:39.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:40.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:40 np0005604791 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb  2 04:54:40 np0005604791 systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb  2 04:54:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:41 np0005604791 python3.9[197035]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:54:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:41.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:54:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:42.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:42 np0005604791 python3.9[197189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:42 np0005604791 python3.9[197312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026081.8428671-3544-96012965375489/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb620002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:43 np0005604791 python3.9[197467]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:43.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:44.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:44 np0005604791 python3.9[197621]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:54:44.893 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 04:54:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:54:44.893 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 04:54:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:54:44.893 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 04:54:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb620002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:45 np0005604791 python3.9[197700]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f80039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:45.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:46 np0005604791 python3.9[197852]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:46.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:46 np0005604791 python3.9[197932]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.7q_xu196 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:47 np0005604791 podman[198057]: 2026-02-02 09:54:47.173233697 +0000 UTC m=+0.076882422 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent)
Feb  2 04:54:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:47 np0005604791 python3.9[198103]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6200089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:47 np0005604791 python3.9[198184]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:54:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:47.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:54:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:48.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:48 np0005604791 python3.9[198336]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:54:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f80039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6200089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:49 np0005604791 python3[198492]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb  2 04:54:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:54:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:49.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:54:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:50.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:50 np0005604791 python3.9[198644]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:50 np0005604791 python3.9[198725]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f80039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f80039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:51 np0005604791 python3.9[198877]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:54:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:51.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:54:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:52.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095452 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:54:52 np0005604791 python3.9[199004]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026091.3288875-3811-263282415164707/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6200096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:53 np0005604791 python3.9[199157]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f80039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:53 np0005604791 python3.9[199237]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:54:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:53.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:54:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:54.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:54 np0005604791 python3.9[199389]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:55 np0005604791 python3.9[199468]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6200096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:55 np0005604791 python3.9[199620]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:54:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:54:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:55.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:54:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:56.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:56 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:54:56 np0005604791 python3.9[199772]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026095.3225706-3928-83691514853329/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:57 np0005604791 python3.9[199925]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:54:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:57.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:54:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:54:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:58.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:54:58 np0005604791 python3.9[200079]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:54:58 np0005604791 python3.9[200235]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:54:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:54:59 np0005604791 python3.9[200389]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:54:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:54:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:54:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:59.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:00.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:00 np0005604791 python3.9[200544]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:55:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608002370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:01 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:01 np0005604791 python3.9[200701]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:55:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:01.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:02.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:02 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:55:02 np0005604791 python3.9[200858]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:03 np0005604791 python3.9[201011]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:55:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00028c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:03 np0005604791 python3.9[201136]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026102.6988423-4144-276850308272281/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:04.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:04.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:04 np0005604791 python3.9[201288]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:55:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:55:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:55:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608002370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:05 np0005604791 python3.9[201414]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026104.0811508-4189-75525924766860/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:06.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:06 np0005604791 python3.9[201566]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:55:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:55:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:06.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:55:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:06 np0005604791 python3.9[201689]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026105.5721936-4234-221588767828483/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608002370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:07 np0005604791 python3.9[201868]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:55:07 np0005604791 systemd[1]: Reloading.
Feb  2 04:55:07 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:55:07 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:55:07 np0005604791 systemd[1]: Reached target edpm_libvirt.target.
Feb  2 04:55:07 np0005604791 podman[201959]: 2026-02-02 09:55:07.976391031 +0000 UTC m=+0.079100741 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb  2 04:55:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:08.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:08.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:08 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:55:08 np0005604791 python3.9[202140]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb  2 04:55:08 np0005604791 systemd[1]: Reloading.
Feb  2 04:55:09 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:55:09 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:55:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:09 np0005604791 systemd[1]: Reloading.
Feb  2 04:55:09 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:55:09 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:55:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:10.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:10.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:10 np0005604791 systemd[1]: session-52.scope: Deactivated successfully.
Feb  2 04:55:10 np0005604791 systemd[1]: session-52.scope: Consumed 3min 11.353s CPU time.
Feb  2 04:55:10 np0005604791 systemd-logind[805]: Session 52 logged out. Waiting for processes to exit.
Feb  2 04:55:10 np0005604791 systemd-logind[805]: Removed session 52.
Feb  2 04:55:10 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:55:10 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:55:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:12.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:12.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:12 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:55:12 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:55:12 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:55:12 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:55:12 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:55:12 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:55:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:13 np0005604791 ceph-mon[80115]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Feb  2 04:55:13 np0005604791 ceph-mon[80115]: Cluster is now healthy
Feb  2 04:55:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:14.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:14.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095514 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:55:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:15 np0005604791 systemd-logind[805]: New session 53 of user zuul.
Feb  2 04:55:15 np0005604791 systemd[1]: Started Session 53 of User zuul.
Feb  2 04:55:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:16.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:16.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:16 np0005604791 python3.9[202420]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.198976) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117199082, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4666, "num_deletes": 502, "total_data_size": 12826185, "memory_usage": 12996592, "flush_reason": "Manual Compaction"}
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Feb  2 04:55:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117252415, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 8261921, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13324, "largest_seqno": 17985, "table_properties": {"data_size": 8244315, "index_size": 11860, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4677, "raw_key_size": 36899, "raw_average_key_size": 19, "raw_value_size": 8207635, "raw_average_value_size": 4393, "num_data_blocks": 518, "num_entries": 1868, "num_filter_entries": 1868, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025678, "oldest_key_time": 1770025678, "file_creation_time": 1770026117, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 53581 microseconds, and 18215 cpu microseconds.
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.252564) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 8261921 bytes OK
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.252622) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.259331) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.259353) EVENT_LOG_v1 {"time_micros": 1770026117259346, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.259372) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 12805597, prev total WAL file size 12842136, number of live WAL files 2.
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.261741) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(8068KB)], [27(11MB)]
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117261831, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 20464118, "oldest_snapshot_seqno": -1}
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5083 keys, 15279652 bytes, temperature: kUnknown
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117389434, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15279652, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15241110, "index_size": 24736, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12741, "raw_key_size": 127177, "raw_average_key_size": 25, "raw_value_size": 15144227, "raw_average_value_size": 2979, "num_data_blocks": 1039, "num_entries": 5083, "num_filter_entries": 5083, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026117, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.389803) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15279652 bytes
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.391999) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.2 rd, 119.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(7.9, 11.6 +0.0 blob) out(14.6 +0.0 blob), read-write-amplify(4.3) write-amplify(1.8) OK, records in: 6106, records dropped: 1023 output_compression: NoCompression
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.392037) EVENT_LOG_v1 {"time_micros": 1770026117392020, "job": 14, "event": "compaction_finished", "compaction_time_micros": 127722, "compaction_time_cpu_micros": 37846, "output_level": 6, "num_output_files": 1, "total_output_size": 15279652, "num_input_records": 6106, "num_output_records": 5083, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117394107, "job": 14, "event": "table_file_deletion", "file_number": 29}
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117396371, "job": 14, "event": "table_file_deletion", "file_number": 27}
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.261615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.396433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.396441) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.396445) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.396449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.396453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:55:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:55:17 np0005604791 podman[202450]: 2026-02-02 09:55:17.410304372 +0000 UTC m=+0.081715001 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb  2 04:55:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:18.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:18.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:18 np0005604791 python3.9[202619]: ansible-ansible.builtin.service_facts Invoked
Feb  2 04:55:18 np0005604791 network[202636]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb  2 04:55:18 np0005604791 network[202637]: 'network-scripts' will be removed from distribution in near future.
Feb  2 04:55:18 np0005604791 network[202638]: It is advised to switch to 'NetworkManager' instead for network management.
Feb  2 04:55:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:55:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:20.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:55:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:20.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:22.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:22.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:23 np0005604791 python3.9[202913]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb  2 04:55:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:24 np0005604791 python3.9[202997]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:55:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:24.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:24.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.003000079s ======
Feb  2 04:55:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:26.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000079s
Feb  2 04:55:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:26.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:28.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:28.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:30.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:30.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:30 np0005604791 python3.9[203153]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:55:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:31 np0005604791 python3.9[203306]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:55:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:31 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb  2 04:55:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:32.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb  2 04:55:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:32.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:32 np0005604791 python3.9[203459]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:55:33 np0005604791 python3.9[203612]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:55:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:33 np0005604791 python3.9[203767]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:55:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:34.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:34.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:34 np0005604791 python3.9[203890]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026133.535011-241-58220896284172/.source.iscsi _original_basename=.l_7mqm52 follow=False checksum=7a06202972a249b967f1a984bcbd8e2cf3a33f8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:35 np0005604791 python3.9[204043]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:36.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:36.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:36 np0005604791 python3.9[204195]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:37 np0005604791 python3.9[204373]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:55:37 np0005604791 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb  2 04:55:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:38.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:38.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:38 np0005604791 podman[204477]: 2026-02-02 09:55:38.396941418 +0000 UTC m=+0.074332415 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb  2 04:55:38 np0005604791 python3.9[204556]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:55:38 np0005604791 systemd[1]: Reloading.
Feb  2 04:55:38 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:55:38 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:55:39 np0005604791 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb  2 04:55:39 np0005604791 systemd[1]: Starting Open-iSCSI...
Feb  2 04:55:39 np0005604791 kernel: Loading iSCSI transport class v2.0-870.
Feb  2 04:55:39 np0005604791 systemd[1]: Started Open-iSCSI.
Feb  2 04:55:39 np0005604791 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb  2 04:55:39 np0005604791 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb  2 04:55:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:40.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:55:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:40.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:55:40 np0005604791 python3.9[204755]: ansible-ansible.builtin.service_facts Invoked
Feb  2 04:55:40 np0005604791 network[204772]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb  2 04:55:40 np0005604791 network[204773]: 'network-scripts' will be removed from distribution in near future.
Feb  2 04:55:40 np0005604791 network[204774]: It is advised to switch to 'NetworkManager' instead for network management.
Feb  2 04:55:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.021858) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142021895, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 478, "num_deletes": 250, "total_data_size": 753496, "memory_usage": 763376, "flush_reason": "Manual Compaction"}
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142026720, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 400569, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17990, "largest_seqno": 18463, "table_properties": {"data_size": 398101, "index_size": 568, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6412, "raw_average_key_size": 19, "raw_value_size": 393094, "raw_average_value_size": 1202, "num_data_blocks": 26, "num_entries": 327, "num_filter_entries": 327, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026117, "oldest_key_time": 1770026117, "file_creation_time": 1770026142, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 4908 microseconds, and 1932 cpu microseconds.
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.026764) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 400569 bytes OK
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.026781) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.028686) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.028701) EVENT_LOG_v1 {"time_micros": 1770026142028697, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.028716) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 750596, prev total WAL file size 750596, number of live WAL files 2.
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.029122) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(391KB)], [30(14MB)]
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142029228, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 15680221, "oldest_snapshot_seqno": -1}
Feb  2 04:55:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:42.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4906 keys, 11699367 bytes, temperature: kUnknown
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142112273, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11699367, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11666307, "index_size": 19702, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12293, "raw_key_size": 123909, "raw_average_key_size": 25, "raw_value_size": 11576842, "raw_average_value_size": 2359, "num_data_blocks": 819, "num_entries": 4906, "num_filter_entries": 4906, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026142, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.112665) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11699367 bytes
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.117815) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 188.5 rd, 140.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 14.6 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(68.4) write-amplify(29.2) OK, records in: 5410, records dropped: 504 output_compression: NoCompression
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.117838) EVENT_LOG_v1 {"time_micros": 1770026142117829, "job": 16, "event": "compaction_finished", "compaction_time_micros": 83181, "compaction_time_cpu_micros": 32724, "output_level": 6, "num_output_files": 1, "total_output_size": 11699367, "num_input_records": 5410, "num_output_records": 4906, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142117987, "job": 16, "event": "table_file_deletion", "file_number": 32}
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142119649, "job": 16, "event": "table_file_deletion", "file_number": 30}
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.029017) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.119734) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.119739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.119742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.119745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.119747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:55:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:42.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:44.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:44.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:44 np0005604791 python3.9[205048]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:55:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:55:44.894 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 04:55:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:55:44.895 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 04:55:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:55:44.895 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 04:55:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:46.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:55:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:46.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:55:46 np0005604791 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb  2 04:55:46 np0005604791 systemd[1]: Starting man-db-cache-update.service...
Feb  2 04:55:46 np0005604791 systemd[1]: Reloading.
Feb  2 04:55:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:46 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:55:46 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:55:46 np0005604791 systemd[1]: Queuing reload/restart jobs for marked units…
Feb  2 04:55:47 np0005604791 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb  2 04:55:47 np0005604791 systemd[1]: Finished man-db-cache-update.service.
Feb  2 04:55:47 np0005604791 systemd[1]: run-r2b981ab0999c4ed39aa5df20ad3a9b03.service: Deactivated successfully.
Feb  2 04:55:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:47 np0005604791 podman[205339]: 2026-02-02 09:55:47.985952676 +0000 UTC m=+0.107899086 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb  2 04:55:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:48.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:48 np0005604791 python3.9[205382]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb  2 04:55:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:48.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:48 np0005604791 python3.9[205539]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb  2 04:55:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:49 np0005604791 python3.9[205696]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:55:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:50.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:50.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:50 np0005604791 python3.9[205819]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026149.2161922-505-160903731970247/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:51 np0005604791 python3.9[205972]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:52.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:52.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:52 np0005604791 python3.9[206124]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:55:52 np0005604791 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb  2 04:55:52 np0005604791 systemd[1]: Stopped Load Kernel Modules.
Feb  2 04:55:52 np0005604791 systemd[1]: Stopping Load Kernel Modules...
Feb  2 04:55:52 np0005604791 systemd[1]: Starting Load Kernel Modules...
Feb  2 04:55:52 np0005604791 systemd[1]: Finished Load Kernel Modules.
Feb  2 04:55:53 np0005604791 python3.9[206281]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:55:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:55:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:54.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:55:54 np0005604791 python3.9[206434]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:55:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:55:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:54.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:55:54 np0005604791 python3.9[206587]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:55:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:55 np0005604791 python3.9[206710]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026154.4315484-658-57031365423968/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:55:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:56.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:55:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:55:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:56.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:55:56 np0005604791 python3.9[206862]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:55:56 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:55:57 np0005604791 python3.9[207041]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:58 np0005604791 python3.9[207193]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:55:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:58.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:55:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:55:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:55:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:58.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:55:58 np0005604791 python3.9[207345]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:55:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:55:59 np0005604791 python3.9[207498]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:00.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:00.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:00 np0005604791 python3.9[207650]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:00 np0005604791 python3.9[207802]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:01 np0005604791 python3.9[207955]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:01 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:02.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:02.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:02 np0005604791 python3.9[208107]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:56:03 np0005604791 python3.9[208262]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:56:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095603 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:56:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:03 np0005604791 python3.9[208416]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:56:04 np0005604791 systemd[1]: Listening on multipathd control socket.
Feb  2 04:56:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:04.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:04.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:05 np0005604791 python3.9[208573]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:56:05 np0005604791 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb  2 04:56:05 np0005604791 udevadm[208578]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb  2 04:56:05 np0005604791 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb  2 04:56:05 np0005604791 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb  2 04:56:05 np0005604791 multipathd[208582]: --------start up--------
Feb  2 04:56:05 np0005604791 multipathd[208582]: read /etc/multipath.conf
Feb  2 04:56:05 np0005604791 multipathd[208582]: path checkers start up
Feb  2 04:56:05 np0005604791 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb  2 04:56:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:06.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:06.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:06 np0005604791 python3.9[208742]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb  2 04:56:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:07 np0005604791 python3.9[208895]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb  2 04:56:07 np0005604791 kernel: Key type psk registered
Feb  2 04:56:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:07 np0005604791 python3.9[209057]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:56:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:08.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:08.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:08 np0005604791 python3.9[209180]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026167.4130406-1048-108689643082974/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:09 np0005604791 podman[209305]: 2026-02-02 09:56:09.156201369 +0000 UTC m=+0.090244217 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb  2 04:56:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:09 np0005604791 python3.9[209352]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:10 np0005604791 python3.9[209512]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:56:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:10.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:10.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:11 np0005604791 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb  2 04:56:11 np0005604791 systemd[1]: Stopped Load Kernel Modules.
Feb  2 04:56:11 np0005604791 systemd[1]: Stopping Load Kernel Modules...
Feb  2 04:56:11 np0005604791 systemd[1]: Starting Load Kernel Modules...
Feb  2 04:56:11 np0005604791 systemd[1]: Finished Load Kernel Modules.
Feb  2 04:56:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:56:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:12.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:12.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:12 np0005604791 python3.9[209669]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb  2 04:56:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:14.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:14.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:14 np0005604791 systemd[1]: Reloading.
Feb  2 04:56:14 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:56:14 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:56:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:14 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:56:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:14 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:56:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:14 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:56:14 np0005604791 systemd[1]: Reloading.
Feb  2 04:56:14 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:56:14 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:56:15 np0005604791 systemd-logind[805]: Watching system buttons on /dev/input/event0 (Power Button)
Feb  2 04:56:15 np0005604791 systemd-logind[805]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb  2 04:56:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:15 np0005604791 lvm[209784]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb  2 04:56:15 np0005604791 lvm[209784]: VG ceph_vg0 finished
Feb  2 04:56:15 np0005604791 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb  2 04:56:15 np0005604791 systemd[1]: Starting man-db-cache-update.service...
Feb  2 04:56:15 np0005604791 systemd[1]: Reloading.
Feb  2 04:56:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:15 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:56:15 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:56:15 np0005604791 systemd[1]: Queuing reload/restart jobs for marked units…
Feb  2 04:56:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:16.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:16.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:16 np0005604791 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb  2 04:56:16 np0005604791 systemd[1]: Finished man-db-cache-update.service.
Feb  2 04:56:16 np0005604791 systemd[1]: man-db-cache-update.service: Consumed 1.482s CPU time.
Feb  2 04:56:16 np0005604791 systemd[1]: run-rc9e90e4836ab4b33914d432eecb8e24a.service: Deactivated successfully.
Feb  2 04:56:17 np0005604791 python3.9[211164]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:56:17 np0005604791 systemd[1]: Stopping Open-iSCSI...
Feb  2 04:56:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:17 np0005604791 iscsid[204596]: iscsid shutting down.
Feb  2 04:56:17 np0005604791 systemd[1]: iscsid.service: Deactivated successfully.
Feb  2 04:56:17 np0005604791 systemd[1]: Stopped Open-iSCSI.
Feb  2 04:56:17 np0005604791 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb  2 04:56:17 np0005604791 systemd[1]: Starting Open-iSCSI...
Feb  2 04:56:17 np0005604791 systemd[1]: Started Open-iSCSI.
Feb  2 04:56:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:56:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:18.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:18.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:18 np0005604791 python3.9[211378]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:56:18 np0005604791 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb  2 04:56:18 np0005604791 multipathd[208582]: exit (signal)
Feb  2 04:56:18 np0005604791 multipathd[208582]: --------shut down-------
Feb  2 04:56:18 np0005604791 systemd[1]: multipathd.service: Deactivated successfully.
Feb  2 04:56:18 np0005604791 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb  2 04:56:18 np0005604791 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb  2 04:56:18 np0005604791 multipathd[211428]: --------start up--------
Feb  2 04:56:18 np0005604791 multipathd[211428]: read /etc/multipath.conf
Feb  2 04:56:18 np0005604791 multipathd[211428]: path checkers start up
Feb  2 04:56:18 np0005604791 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb  2 04:56:18 np0005604791 podman[211405]: 2026-02-02 09:56:18.36799089 +0000 UTC m=+0.094911781 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bddd697658cec4, tcib_managed=true)
Feb  2 04:56:18 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:56:18 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:56:18 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:56:18 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:56:19 np0005604791 python3.9[211586]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb  2 04:56:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:20.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:20.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:20 np0005604791 python3.9[211742]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:21 np0005604791 python3.9[211895]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb  2 04:56:21 np0005604791 systemd[1]: Reloading.
Feb  2 04:56:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:21 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:56:21 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:56:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:56:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:22.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:56:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:22.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:22 np0005604791 python3.9[212080]: ansible-ansible.builtin.service_facts Invoked
Feb  2 04:56:22 np0005604791 network[212097]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb  2 04:56:22 np0005604791 network[212098]: 'network-scripts' will be removed from distribution in near future.
Feb  2 04:56:22 np0005604791 network[212099]: It is advised to switch to 'NetworkManager' instead for network management.
Feb  2 04:56:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095623 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:56:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:24.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:24.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:24 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:56:24 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:56:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 04:56:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:26.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 04:56:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:26.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:56:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:28.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:56:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:28.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:28 np0005604791 python3.9[212400]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:56:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:29 np0005604791 python3.9[212554]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:56:29 np0005604791 systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb  2 04:56:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:30 np0005604791 python3.9[212708]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:56:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:30.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:30.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:30 np0005604791 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb  2 04:56:30 np0005604791 python3.9[212862]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:56:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:31 np0005604791 kernel: ganesha.nfsd[208288]: segfault at 50 ip 00007fb6a9d2f32e sp 00007fb611ffa210 error 4 in libntirpc.so.5.8[7fb6a9d14000+2c000] likely on CPU 5 (core 0, socket 5)
Feb  2 04:56:31 np0005604791 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb  2 04:56:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001a60 fd 48 proxy ignored for local
Feb  2 04:56:31 np0005604791 systemd[1]: Started Process Core Dump (PID 213017/UID 0).
Feb  2 04:56:31 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:31 np0005604791 python3.9[213016]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:56:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:32.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:32.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:32 np0005604791 systemd-coredump[213018]: Process 184917 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 62:#012#0  0x00007fb6a9d2f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Feb  2 04:56:32 np0005604791 systemd[1]: systemd-coredump@6-213017-0.service: Deactivated successfully.
Feb  2 04:56:32 np0005604791 python3.9[213171]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:56:32 np0005604791 podman[213176]: 2026-02-02 09:56:32.491089344 +0000 UTC m=+0.039077289 container died 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Feb  2 04:56:32 np0005604791 systemd[1]: var-lib-containers-storage-overlay-b41398c763b3c102a46779e12a2f4cfcf9f278ef37c2c2c2f473f0bbe2f41a2c-merged.mount: Deactivated successfully.
Feb  2 04:56:32 np0005604791 podman[213176]: 2026-02-02 09:56:32.52594949 +0000 UTC m=+0.073937445 container remove 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 04:56:32 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb  2 04:56:32 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 04:56:32 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.287s CPU time.
Feb  2 04:56:33 np0005604791 python3.9[213372]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:56:34 np0005604791 python3.9[213525]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:56:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:34.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:56:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:34.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:56:35 np0005604791 python3.9[213679]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:35 np0005604791 python3.9[213831]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:56:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:36.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:56:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:36.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:36 np0005604791 python3.9[214008]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:37 np0005604791 python3.9[214161]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095637 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:56:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:38.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:38 np0005604791 python3.9[214313]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:38.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:38 np0005604791 python3.9[214465]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:39 np0005604791 podman[214618]: 2026-02-02 09:56:39.303244744 +0000 UTC m=+0.089203269 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb  2 04:56:39 np0005604791 python3.9[214619]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:40 np0005604791 python3.9[214796]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:40.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:40.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:40 np0005604791 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb  2 04:56:40 np0005604791 systemd[1]: virtqemud.service: Deactivated successfully.
Feb  2 04:56:41 np0005604791 python3.9[214951]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:41 np0005604791 python3.9[215103]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:56:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:42.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:56:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:42.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:42 np0005604791 python3.9[215255]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:42 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 7.
Feb  2 04:56:42 np0005604791 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:56:42 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.287s CPU time.
Feb  2 04:56:42 np0005604791 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:56:43 np0005604791 podman[215455]: 2026-02-02 09:56:43.05170777 +0000 UTC m=+0.047347511 container create a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True)
Feb  2 04:56:43 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b577d442f85adf05c581d868a55d038674fea5811e9f6b8663a812196de41a7/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb  2 04:56:43 np0005604791 python3.9[215425]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:43 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b577d442f85adf05c581d868a55d038674fea5811e9f6b8663a812196de41a7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:56:43 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b577d442f85adf05c581d868a55d038674fea5811e9f6b8663a812196de41a7/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:56:43 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b577d442f85adf05c581d868a55d038674fea5811e9f6b8663a812196de41a7/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:56:43 np0005604791 podman[215455]: 2026-02-02 09:56:43.025924671 +0000 UTC m=+0.021564472 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:56:43 np0005604791 podman[215455]: 2026-02-02 09:56:43.140711453 +0000 UTC m=+0.136351214 container init a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Feb  2 04:56:43 np0005604791 podman[215455]: 2026-02-02 09:56:43.144328385 +0000 UTC m=+0.139968126 container start a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:56:43 np0005604791 bash[215455]: a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de
Feb  2 04:56:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb  2 04:56:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb  2 04:56:43 np0005604791 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:56:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb  2 04:56:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb  2 04:56:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb  2 04:56:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb  2 04:56:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb  2 04:56:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:56:44 np0005604791 python3.9[215663]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:44.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:44.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:44 np0005604791 python3.9[215815]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:56:44.896 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 04:56:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:56:44.896 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 04:56:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:56:44.897 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 04:56:45 np0005604791 python3.9[215968]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:45 np0005604791 python3.9[216120]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:56:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:46.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:46.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:46 np0005604791 python3.9[216272]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:56:47 np0005604791 python3.9[216425]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb  2 04:56:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:56:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:48.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:56:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:48.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:48 np0005604791 podman[216549]: 2026-02-02 09:56:48.770497757 +0000 UTC m=+0.067876034 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb  2 04:56:49 np0005604791 python3.9[216597]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb  2 04:56:49 np0005604791 systemd[1]: Reloading.
Feb  2 04:56:49 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:56:49 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:56:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:49 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:56:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:49 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:56:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:50.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:50 np0005604791 python3.9[216785]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:56:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:50.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:50 np0005604791 python3.9[216938]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:56:51 np0005604791 python3.9[217092]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:56:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:52.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:52 np0005604791 python3.9[217245]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:56:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:52.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:52 np0005604791 python3.9[217398]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:56:53 np0005604791 python3.9[217552]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:56:54 np0005604791 python3.9[217705]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:56:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:54.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:54.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:54 np0005604791 python3.9[217858]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe354000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:56:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:56.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:56:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:56:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:56.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:56:56 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:56:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:57 : epoch 698074db : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe348000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:56:57 np0005604791 kernel: ganesha.nfsd[217899]: segfault at 50 ip 00007fe3f9da232e sp 00007fe3627fb210 error 4 in libntirpc.so.5.8[7fe3f9d87000+2c000] likely on CPU 0 (core 0, socket 0)
Feb  2 04:56:57 np0005604791 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb  2 04:56:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:57 : epoch 698074db : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe348000b60 fd 37 proxy ignored for local
Feb  2 04:56:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095657 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:56:57 np0005604791 systemd[1]: Started Process Core Dump (PID 218020/UID 0).
Feb  2 04:56:57 np0005604791 python3.9[218056]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:56:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:58.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:56:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:56:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:58.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:56:58 np0005604791 systemd-coredump[218027]: Process 215474 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 54:#012#0  0x00007fe3f9da232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Feb  2 04:56:58 np0005604791 systemd[1]: systemd-coredump@7-218020-0.service: Deactivated successfully.
Feb  2 04:56:58 np0005604791 podman[218213]: 2026-02-02 09:56:58.461652777 +0000 UTC m=+0.025389749 container died a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:56:58 np0005604791 systemd[1]: var-lib-containers-storage-overlay-8b577d442f85adf05c581d868a55d038674fea5811e9f6b8663a812196de41a7-merged.mount: Deactivated successfully.
Feb  2 04:56:58 np0005604791 podman[218213]: 2026-02-02 09:56:58.49582355 +0000 UTC m=+0.059560502 container remove a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Feb  2 04:56:58 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb  2 04:56:58 np0005604791 python3.9[218208]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:56:58 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 04:56:58 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.053s CPU time.
Feb  2 04:56:59 np0005604791 python3.9[218407]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:00 np0005604791 python3.9[218559]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:00.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:00.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:00 np0005604791 python3.9[218711]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:01 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:01 np0005604791 python3.9[218864]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:02.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:02.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:02 np0005604791 python3.9[219016]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:02 np0005604791 python3.9[219169]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095703 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 04:57:03 np0005604791 python3.9[219321]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:04.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:04.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:04 np0005604791 python3.9[219473]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000051s ======
Feb  2 04:57:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:06.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Feb  2 04:57:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:06.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:06 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:57:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:08.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:57:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:08.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:08 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 8.
Feb  2 04:57:08 np0005604791 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:57:08 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.053s CPU time.
Feb  2 04:57:08 np0005604791 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb  2 04:57:09 np0005604791 podman[219547]: 2026-02-02 09:57:09.076616632 +0000 UTC m=+0.064123979 container create 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 04:57:09 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73dc38675cb95bd2ce9e007b58ce4c3a2f4231b213b0e6b3b48bce8e41439171/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:09 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73dc38675cb95bd2ce9e007b58ce4c3a2f4231b213b0e6b3b48bce8e41439171/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:09 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73dc38675cb95bd2ce9e007b58ce4c3a2f4231b213b0e6b3b48bce8e41439171/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:09 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73dc38675cb95bd2ce9e007b58ce4c3a2f4231b213b0e6b3b48bce8e41439171/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:09 np0005604791 podman[219547]: 2026-02-02 09:57:09.132010906 +0000 UTC m=+0.119518253 container init 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Feb  2 04:57:09 np0005604791 podman[219547]: 2026-02-02 09:57:09.044389148 +0000 UTC m=+0.031896545 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 04:57:09 np0005604791 podman[219547]: 2026-02-02 09:57:09.140665997 +0000 UTC m=+0.128173334 container start 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:57:09 np0005604791 bash[219547]: 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e
Feb  2 04:57:09 np0005604791 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 04:57:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb  2 04:57:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb  2 04:57:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb  2 04:57:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb  2 04:57:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb  2 04:57:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb  2 04:57:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb  2 04:57:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 04:57:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:10.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:10 np0005604791 podman[219704]: 2026-02-02 09:57:10.181819059 +0000 UTC m=+0.126082701 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb  2 04:57:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:10.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:10 np0005604791 python3.9[219746]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb  2 04:57:11 np0005604791 python3.9[219909]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb  2 04:57:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:12 np0005604791 python3.9[220067]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb  2 04:57:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:12 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:57:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:12.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:12 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:57:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:12.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:14 np0005604791 systemd-logind[805]: New session 54 of user zuul.
Feb  2 04:57:14 np0005604791 systemd[1]: Started Session 54 of User zuul.
Feb  2 04:57:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:57:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:14.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:57:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:14.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:14 np0005604791 systemd[1]: session-54.scope: Deactivated successfully.
Feb  2 04:57:14 np0005604791 systemd-logind[805]: Session 54 logged out. Waiting for processes to exit.
Feb  2 04:57:14 np0005604791 systemd-logind[805]: Removed session 54.
Feb  2 04:57:15 np0005604791 python3.9[220256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:57:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:15 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 04:57:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:15 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 04:57:15 np0005604791 python3.9[220377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026234.7935214-2655-231399672093957/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:16.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:16.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:16 np0005604791 python3.9[220527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:57:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:16 np0005604791 python3.9[220603]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:17 np0005604791 python3.9[220779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:57:17 np0005604791 python3.9[220900]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026236.9955747-2655-255954910177455/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:18.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:57:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:18.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:57:18 np0005604791 python3.9[221050]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:57:19 np0005604791 podman[221146]: 2026-02-02 09:57:19.171080561 +0000 UTC m=+0.066793577 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Feb  2 04:57:19 np0005604791 python3.9[221183]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026238.0810962-2655-16037526371783/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:19 np0005604791 python3.9[221341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:57:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:20.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:20.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:20 np0005604791 python3.9[221462]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026239.4295225-2655-85468870633035/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:21 np0005604791 python3.9[221613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc740000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:21 np0005604791 python3.9[221750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026240.5676095-2655-253180740455012/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:22.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:22.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:22 np0005604791 python3.9[221902]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:57:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:23 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:23 np0005604791 python3.9[222055]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:57:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095723 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 04:57:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:23 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:23 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7240016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:24.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:24 np0005604791 python3.9[222259]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:57:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:57:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:24.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:57:24 np0005604791 python3.9[222442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:57:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:25 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7280016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 04:57:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:57:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:57:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 04:57:25 np0005604791 python3.9[222565]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1770026244.4763923-2976-137110893292462/.source _original_basename=.3j3jpy7v follow=False checksum=2ad71deba695918c6972b05c522c472e308b3cb4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Feb  2 04:57:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:25 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:25 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730001570 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:26.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:57:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:26.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:57:26 np0005604791 python3.9[222717]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:57:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:27 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7240016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:27 np0005604791 python3.9[222870]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:57:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:27 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7280016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:27 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:27 np0005604791 python3.9[222991]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026246.858164-3054-69020391126883/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aafdeb4849f80b4aa3d95767e2f1397576892cd0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:57:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:28.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:57:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:57:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:28.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:57:28 np0005604791 python3.9[223141]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb  2 04:57:29 np0005604791 python3.9[223263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026248.133443-3099-9849263248571/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=a1f1b826d995a314b6b973b7452c5ae4777408c1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb  2 04:57:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:29 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730002090 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:29 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:57:29 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 04:57:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:29 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7240016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:29 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7240016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:30.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:30 np0005604791 python3.9[223440]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Feb  2 04:57:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:30.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:31 np0005604791 python3.9[223593]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb  2 04:57:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:31 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:31 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730002090 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:31 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728002720 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:31 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.063641) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252063668, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1277, "num_deletes": 254, "total_data_size": 3111561, "memory_usage": 3137440, "flush_reason": "Manual Compaction"}
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252077974, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2037435, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18468, "largest_seqno": 19740, "table_properties": {"data_size": 2031933, "index_size": 2897, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11090, "raw_average_key_size": 18, "raw_value_size": 2021027, "raw_average_value_size": 3390, "num_data_blocks": 130, "num_entries": 596, "num_filter_entries": 596, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026143, "oldest_key_time": 1770026143, "file_creation_time": 1770026252, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 14401 microseconds, and 4096 cpu microseconds.
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.078035) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2037435 bytes OK
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.078058) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.081886) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.081909) EVENT_LOG_v1 {"time_micros": 1770026252081902, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.081927) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3105476, prev total WAL file size 3105476, number of live WAL files 2.
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.082742) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323531' seq:0, type:0; will stop at (end)
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(1989KB)], [33(11MB)]
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252082798, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 13736802, "oldest_snapshot_seqno": -1}
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4980 keys, 13234698 bytes, temperature: kUnknown
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252172105, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13234698, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13200221, "index_size": 20927, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 126574, "raw_average_key_size": 25, "raw_value_size": 13108586, "raw_average_value_size": 2632, "num_data_blocks": 858, "num_entries": 4980, "num_filter_entries": 4980, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026252, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.172319) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13234698 bytes
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.179977) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.7 rd, 148.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.2 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(13.2) write-amplify(6.5) OK, records in: 5502, records dropped: 522 output_compression: NoCompression
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.180012) EVENT_LOG_v1 {"time_micros": 1770026252179998, "job": 18, "event": "compaction_finished", "compaction_time_micros": 89384, "compaction_time_cpu_micros": 36350, "output_level": 6, "num_output_files": 1, "total_output_size": 13234698, "num_input_records": 5502, "num_output_records": 4980, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252180306, "job": 18, "event": "table_file_deletion", "file_number": 35}
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252181062, "job": 18, "event": "table_file_deletion", "file_number": 33}
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.082626) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.181098) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.181103) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.181105) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.181106) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:57:32 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.181108) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 04:57:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:57:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:32.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:57:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:32.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:32 np0005604791 python3[223745]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Feb  2 04:57:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:33 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:33 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:33 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730002090 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:34.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:34.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:35 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728002720 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:35 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:35 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:36.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:57:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:36.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:57:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:37 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730003520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:37 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728002720 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:37 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:38.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:57:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:38.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:57:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:39 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:39 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730003520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:39 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:40.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:57:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:40.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:57:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:41 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:41 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:41 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730003520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:57:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:42.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:57:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:42.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:43 np0005604791 podman[223847]: 2026-02-02 09:57:43.01526774 +0000 UTC m=+2.685890834 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb  2 04:57:43 np0005604791 podman[223760]: 2026-02-02 09:57:43.046254751 +0000 UTC m=+10.472975788 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83
Feb  2 04:57:43 np0005604791 podman[223898]: 2026-02-02 09:57:43.245603793 +0000 UTC m=+0.079862041 container create f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Feb  2 04:57:43 np0005604791 podman[223898]: 2026-02-02 09:57:43.202520323 +0000 UTC m=+0.036778581 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83
Feb  2 04:57:43 np0005604791 python3[223745]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb  2 04:57:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:43 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:43 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:43 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:44 np0005604791 python3.9[224088]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:57:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:44.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:44.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:57:44.897 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 04:57:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:57:44.898 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 04:57:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 09:57:44.898 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 04:57:45 np0005604791 python3.9[224243]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Feb  2 04:57:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:45 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730004620 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:45 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:45 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:46.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:46.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:46 np0005604791 python3.9[224395]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb  2 04:57:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:47 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:47 np0005604791 python3[224548]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Feb  2 04:57:47 np0005604791 podman[224584]: 2026-02-02 09:57:47.591649918 +0000 UTC m=+0.056420992 container create 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true)
Feb  2 04:57:47 np0005604791 podman[224584]: 2026-02-02 09:57:47.565284344 +0000 UTC m=+0.030055418 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83
Feb  2 04:57:47 np0005604791 python3[224548]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83 kolla_start
Feb  2 04:57:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:47 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730004620 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:47 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:48.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:48.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:48 np0005604791 python3.9[224773]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:57:49 np0005604791 podman[224928]: 2026-02-02 09:57:49.318032032 +0000 UTC m=+0.070994895 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb  2 04:57:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:49 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:49 np0005604791 python3.9[224929]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:57:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:49 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:49 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730004620 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:50 np0005604791 python3.9[225098]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770026269.540563-3387-101090893348406/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb  2 04:57:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:50.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:50.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:50 np0005604791 python3.9[225174]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb  2 04:57:50 np0005604791 systemd[1]: Reloading.
Feb  2 04:57:50 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:57:50 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:57:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:51 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:51 np0005604791 python3.9[225286]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb  2 04:57:51 np0005604791 systemd[1]: Reloading.
Feb  2 04:57:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:51 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:51 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:51 np0005604791 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Feb  2 04:57:51 np0005604791 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb  2 04:57:51 np0005604791 systemd[1]: Starting nova_compute container...
Feb  2 04:57:51 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:57:51 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:51 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:51 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:51 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:51 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:51 np0005604791 podman[225326]: 2026-02-02 09:57:51.993451697 +0000 UTC m=+0.123134736 container init 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb  2 04:57:52 np0005604791 podman[225326]: 2026-02-02 09:57:52.000974309 +0000 UTC m=+0.130657348 container start 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb  2 04:57:52 np0005604791 podman[225326]: nova_compute
Feb  2 04:57:52 np0005604791 nova_compute[225341]: + sudo -E kolla_set_configs
Feb  2 04:57:52 np0005604791 systemd[1]: Started nova_compute container.
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Validating config file
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying service configuration files
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Deleting /etc/ceph
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Creating directory /etc/ceph
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/ceph
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Writing out command to execute
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb  2 04:57:52 np0005604791 nova_compute[225341]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb  2 04:57:52 np0005604791 nova_compute[225341]: ++ cat /run_command
Feb  2 04:57:52 np0005604791 nova_compute[225341]: + CMD=nova-compute
Feb  2 04:57:52 np0005604791 nova_compute[225341]: + ARGS=
Feb  2 04:57:52 np0005604791 nova_compute[225341]: + sudo kolla_copy_cacerts
Feb  2 04:57:52 np0005604791 nova_compute[225341]: + [[ ! -n '' ]]
Feb  2 04:57:52 np0005604791 nova_compute[225341]: + . kolla_extend_start
Feb  2 04:57:52 np0005604791 nova_compute[225341]: Running command: 'nova-compute'
Feb  2 04:57:52 np0005604791 nova_compute[225341]: + echo 'Running command: '\''nova-compute'\'''
Feb  2 04:57:52 np0005604791 nova_compute[225341]: + umask 0022
Feb  2 04:57:52 np0005604791 nova_compute[225341]: + exec nova-compute
Feb  2 04:57:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:52.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb  2 04:57:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:52.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb  2 04:57:53 np0005604791 python3.9[225503]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:57:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:53 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:53 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:53 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc720000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:54 np0005604791 python3.9[225657]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:57:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:57:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:54.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:57:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:54.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:54 np0005604791 nova_compute[225341]: 2026-02-02 09:57:54.644 225345 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Feb  2 04:57:54 np0005604791 nova_compute[225341]: 2026-02-02 09:57:54.645 225345 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Feb  2 04:57:54 np0005604791 nova_compute[225341]: 2026-02-02 09:57:54.645 225345 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Feb  2 04:57:54 np0005604791 nova_compute[225341]: 2026-02-02 09:57:54.645 225345 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Feb  2 04:57:54 np0005604791 nova_compute[225341]: 2026-02-02 09:57:54.791 225345 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 04:57:54 np0005604791 nova_compute[225341]: 2026-02-02 09:57:54.815 225345 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 04:57:54 np0005604791 nova_compute[225341]: 2026-02-02 09:57:54.815 225345 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Feb  2 04:57:54 np0005604791 python3.9[225811]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb  2 04:57:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:55 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.461 225345 INFO nova.virt.driver [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.610 225345 INFO nova.compute.provider_config [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Feb  2 04:57:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:55 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc740000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:55 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc720000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.666 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.667 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.668 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.669 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.669 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.670 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.670 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.671 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.671 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.671 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.672 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.672 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.673 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.673 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.674 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.674 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.674 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.675 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.675 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.676 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.676 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.677 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.677 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.678 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.678 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.679 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.679 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.680 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.680 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.680 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.681 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.681 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.682 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.682 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.683 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.684 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.684 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.685 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.685 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.686 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.686 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.687 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.688 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.688 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.689 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.690 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.690 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.691 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.691 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.691 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.692 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.692 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.693 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.693 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.693 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.694 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.694 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.694 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.695 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.695 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.696 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.696 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.696 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.697 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.697 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.697 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.698 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.698 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.698 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.699 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.699 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.699 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.699 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.700 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.700 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.700 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.701 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.701 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.701 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.701 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.702 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.702 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.702 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.703 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.703 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.703 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.704 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.704 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.704 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.705 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.705 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.705 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.706 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.706 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.707 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.707 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.707 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.708 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.708 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.709 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.709 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.709 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.710 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.710 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.710 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.711 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.711 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.711 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.711 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.712 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.712 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.712 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.712 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.713 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.713 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.713 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.713 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.714 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.714 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.714 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.715 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.715 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.715 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.715 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.716 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.716 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.716 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.716 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.717 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.717 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.717 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.717 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.718 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.718 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.718 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.718 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.719 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.719 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.719 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.719 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.720 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.720 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.720 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.720 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.721 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.721 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.721 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.721 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.722 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.722 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.722 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.723 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.723 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.723 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.723 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.724 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.724 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.724 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.724 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.725 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.725 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.725 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.725 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.726 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.726 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.726 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.727 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.727 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.727 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.727 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.728 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.728 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.728 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.728 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.729 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.729 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.729 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.729 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.730 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.730 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.730 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.730 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.730 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.731 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.731 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.731 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.731 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.731 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.734 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.734 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.734 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.734 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.734 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.736 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.736 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.736 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.736 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.736 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.739 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.739 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.739 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.739 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.739 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.743 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.743 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.743 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.743 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.743 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.746 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.746 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.760 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.760 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.760 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.760 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.760 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.761 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.761 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.761 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.761 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.761 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.762 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.762 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.762 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.762 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.762 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.763 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.763 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.763 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.763 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.763 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.767 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.767 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.767 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.767 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.767 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.776 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.776 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.776 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.776 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.776 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.784 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.784 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.784 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.784 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.784 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 WARNING oslo_config.cfg [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb  2 04:57:55 np0005604791 nova_compute[225341]: live_migration_uri is deprecated for removal in favor of two other options that
Feb  2 04:57:55 np0005604791 nova_compute[225341]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb  2 04:57:55 np0005604791 nova_compute[225341]: and ``live_migration_inbound_addr`` respectively.
Feb  2 04:57:55 np0005604791 nova_compute[225341]: ).  Its value may be silently ignored in the future.#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.786 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.786 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.786 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.786 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.786 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rbd_secret_uuid        = d241d473-9fcb-5f74-b163-f1ca4454e7f1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.790 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.790 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.790 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.790 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.790 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.808 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.808 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.808 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.808 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.808 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.812 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.812 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.812 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.812 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.812 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.818 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.818 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.818 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.818 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.818 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.827 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.827 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.827 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.827 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.827 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.837 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.837 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.837 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.837 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.837 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.841 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.841 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.841 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.841 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.841 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.845 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.845 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.845 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.845 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.845 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.849 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.849 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.849 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.849 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.849 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.856 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.856 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.856 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.856 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.857 225345 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.877 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.877 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.878 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.878 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Feb  2 04:57:55 np0005604791 systemd[1]: Starting libvirt QEMU daemon...
Feb  2 04:57:55 np0005604791 systemd[1]: Started libvirt QEMU daemon.
Feb  2 04:57:55 np0005604791 python3.9[225966]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.933 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7a452816d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.936 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7a452816d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.936 225345 INFO nova.virt.libvirt.driver [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Connection event '1' reason 'None'#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.966 225345 WARNING nova.virt.libvirt.driver [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Feb  2 04:57:55 np0005604791 nova_compute[225341]: 2026-02-02 09:57:55.966 225345 DEBUG nova.virt.libvirt.volume.mount [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Feb  2 04:57:56 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:57:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:56.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:56.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:56 np0005604791 nova_compute[225341]: 2026-02-02 09:57:56.755 225345 INFO nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Libvirt host capabilities <capabilities>
Feb  2 04:57:56 np0005604791 nova_compute[225341]: 
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <host>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <uuid>7f778d97-f318-4380-8776-2e4d99e5fd86</uuid>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <cpu>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <arch>x86_64</arch>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model>EPYC-Rome-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <vendor>AMD</vendor>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <microcode version='16777317'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <signature family='23' model='49' stepping='0'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <maxphysaddr mode='emulate' bits='40'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='x2apic'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='tsc-deadline'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='osxsave'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='hypervisor'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='tsc_adjust'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='spec-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='stibp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='arch-capabilities'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='cmp_legacy'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='topoext'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='virt-ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='lbrv'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='tsc-scale'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='vmcb-clean'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='pause-filter'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='pfthreshold'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='svme-addr-chk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='rdctl-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='skip-l1dfl-vmentry'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='mds-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature name='pschange-mc-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <pages unit='KiB' size='4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <pages unit='KiB' size='2048'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <pages unit='KiB' size='1048576'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </cpu>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <power_management>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <suspend_mem/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </power_management>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <iommu support='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <migration_features>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <live/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <uri_transports>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <uri_transport>tcp</uri_transport>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <uri_transport>rdma</uri_transport>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </uri_transports>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </migration_features>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <topology>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <cells num='1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <cell id='0'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:          <memory unit='KiB'>7864292</memory>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:          <pages unit='KiB' size='4'>1966073</pages>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:          <pages unit='KiB' size='2048'>0</pages>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:          <pages unit='KiB' size='1048576'>0</pages>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:          <distances>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:            <sibling id='0' value='10'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:          </distances>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:          <cpus num='8'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:          </cpus>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        </cell>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </cells>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </topology>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <cache>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </cache>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <secmodel>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model>selinux</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <doi>0</doi>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </secmodel>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <secmodel>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model>dac</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <doi>0</doi>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <baselabel type='kvm'>+107:+107</baselabel>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <baselabel type='qemu'>+107:+107</baselabel>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </secmodel>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </host>
Feb  2 04:57:56 np0005604791 nova_compute[225341]: 
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <guest>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <os_type>hvm</os_type>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <arch name='i686'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <wordsize>32</wordsize>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <domain type='qemu'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <domain type='kvm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </arch>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <features>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <pae/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <nonpae/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <acpi default='on' toggle='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <apic default='on' toggle='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <cpuselection/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <deviceboot/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <disksnapshot default='on' toggle='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <externalSnapshot/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </features>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </guest>
Feb  2 04:57:56 np0005604791 nova_compute[225341]: 
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <guest>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <os_type>hvm</os_type>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <arch name='x86_64'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <wordsize>64</wordsize>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <domain type='qemu'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <domain type='kvm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </arch>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <features>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <acpi default='on' toggle='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <apic default='on' toggle='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <cpuselection/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <deviceboot/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <disksnapshot default='on' toggle='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <externalSnapshot/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </features>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </guest>
Feb  2 04:57:56 np0005604791 nova_compute[225341]: 
Feb  2 04:57:56 np0005604791 nova_compute[225341]: </capabilities>
Feb  2 04:57:56 np0005604791 nova_compute[225341]: #033[00m
Feb  2 04:57:56 np0005604791 nova_compute[225341]: 2026-02-02 09:57:56.765 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Feb  2 04:57:56 np0005604791 nova_compute[225341]: 2026-02-02 09:57:56.792 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb  2 04:57:56 np0005604791 nova_compute[225341]: <domainCapabilities>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <path>/usr/libexec/qemu-kvm</path>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <domain>kvm</domain>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <machine>pc-i440fx-rhel7.6.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <arch>i686</arch>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <vcpu max='240'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <iothreads supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <os supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <enum name='firmware'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <loader supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>rom</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pflash</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='readonly'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>yes</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>no</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='secure'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>no</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </loader>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </os>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <cpu>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='host-passthrough' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='hostPassthroughMigratable'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>on</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>off</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='maximum' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='maximumMigratable'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>on</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>off</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='host-model' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model fallback='forbid'>EPYC-Rome</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <vendor>AMD</vendor>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <maxphysaddr mode='passthrough' limit='40'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='x2apic'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc-deadline'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='hypervisor'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc_adjust'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='spec-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='stibp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='cmp_legacy'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='overflow-recov'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='succor'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='amd-ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='virt-ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='lbrv'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc-scale'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='vmcb-clean'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='flushbyasid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='pause-filter'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='pfthreshold'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='svme-addr-chk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='lfence-always-serializing'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='disable' name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='custom' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='ClearwaterForest'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ddpd-u'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sha512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm3'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='ClearwaterForest-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ddpd-u'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sha512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm3'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Dhyana-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Turin'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibpb-brtype'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbpb'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Turin-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibpb-brtype'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbpb'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-128'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-256'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-128'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-256'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v6'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v7'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='KnightsMill'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4fmaps'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4vnniw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512er'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512pf'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='KnightsMill-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4fmaps'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4vnniw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512er'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512pf'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G4-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tbm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G5-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tbm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Snowridge'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='athlon'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='athlon-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='core2duo'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='core2duo-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='coreduo'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='coreduo-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='n270'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='n270-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='phenom'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='phenom-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </cpu>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <memoryBacking supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <enum name='sourceType'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <value>file</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <value>anonymous</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <value>memfd</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </memoryBacking>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <devices>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <disk supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='diskDevice'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>disk</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>cdrom</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>floppy</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>lun</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='bus'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>ide</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>fdc</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>scsi</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>sata</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio-transitional</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio-non-transitional</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </disk>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <graphics supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vnc</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>egl-headless</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>dbus</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </graphics>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <video supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='modelType'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vga</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>cirrus</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>none</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>bochs</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>ramfb</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </video>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <hostdev supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='mode'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>subsystem</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='startupPolicy'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>default</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>mandatory</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>requisite</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>optional</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='subsysType'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pci</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>scsi</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='capsType'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='pciBackend'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </hostdev>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <rng supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio-transitional</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio-non-transitional</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>random</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>egd</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>builtin</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </rng>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <filesystem supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='driverType'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>path</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>handle</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtiofs</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </filesystem>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <tpm supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>tpm-tis</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>tpm-crb</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>emulator</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>external</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='backendVersion'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>2.0</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </tpm>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <redirdev supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='bus'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </redirdev>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <channel supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pty</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>unix</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </channel>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <crypto supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='model'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>qemu</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>builtin</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </crypto>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <interface supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='backendType'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>default</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>passt</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </interface>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <panic supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>isa</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>hyperv</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </panic>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <console supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>null</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vc</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pty</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>dev</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>file</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pipe</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>stdio</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>udp</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>tcp</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>unix</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>qemu-vdagent</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>dbus</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </console>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </devices>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <features>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <gic supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <vmcoreinfo supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <genid supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <backingStoreInput supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <backup supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <async-teardown supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <s390-pv supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <ps2 supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <tdx supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <sev supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <sgx supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <hyperv supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='features'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>relaxed</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vapic</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>spinlocks</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vpindex</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>runtime</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>synic</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>stimer</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>reset</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vendor_id</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>frequencies</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>reenlightenment</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>tlbflush</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>ipi</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>avic</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>emsr_bitmap</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>xmm_input</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <defaults>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <spinlocks>4095</spinlocks>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <stimer_direct>on</stimer_direct>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <tlbflush_direct>on</tlbflush_direct>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <tlbflush_extended>on</tlbflush_extended>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <vendor_id>Linux KVM Hv</vendor_id>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </defaults>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </hyperv>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <launchSecurity supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </features>
Feb  2 04:57:56 np0005604791 nova_compute[225341]: </domainCapabilities>
Feb  2 04:57:56 np0005604791 nova_compute[225341]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb  2 04:57:56 np0005604791 nova_compute[225341]: 2026-02-02 09:57:56.801 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb  2 04:57:56 np0005604791 nova_compute[225341]: <domainCapabilities>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <path>/usr/libexec/qemu-kvm</path>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <domain>kvm</domain>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <machine>pc-q35-rhel9.8.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <arch>i686</arch>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <vcpu max='4096'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <iothreads supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <os supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <enum name='firmware'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <loader supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>rom</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pflash</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='readonly'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>yes</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>no</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='secure'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>no</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </loader>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </os>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <cpu>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='host-passthrough' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='hostPassthroughMigratable'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>on</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>off</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='maximum' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='maximumMigratable'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>on</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>off</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='host-model' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model fallback='forbid'>EPYC-Rome</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <vendor>AMD</vendor>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <maxphysaddr mode='passthrough' limit='40'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='x2apic'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc-deadline'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='hypervisor'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc_adjust'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='spec-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='stibp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='cmp_legacy'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='overflow-recov'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='succor'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='amd-ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='virt-ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='lbrv'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc-scale'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='vmcb-clean'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='flushbyasid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='pause-filter'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='pfthreshold'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='svme-addr-chk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='lfence-always-serializing'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='disable' name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='custom' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='ClearwaterForest'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ddpd-u'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sha512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm3'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='ClearwaterForest-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ddpd-u'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sha512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm3'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Dhyana-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Turin'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibpb-brtype'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbpb'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Turin-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibpb-brtype'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbpb'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-128'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-256'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-128'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-256'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v6'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v7'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='KnightsMill'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4fmaps'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4vnniw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512er'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512pf'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='KnightsMill-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4fmaps'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4vnniw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512er'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512pf'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G4-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tbm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G5-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tbm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Snowridge'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='athlon'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='athlon-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='core2duo'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='core2duo-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='coreduo'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='coreduo-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='n270'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='n270-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='phenom'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='phenom-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </cpu>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <memoryBacking supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <enum name='sourceType'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <value>file</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <value>anonymous</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <value>memfd</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </memoryBacking>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <devices>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <disk supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='diskDevice'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>disk</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>cdrom</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>floppy</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>lun</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='bus'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>fdc</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>scsi</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>sata</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio-transitional</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio-non-transitional</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </disk>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <graphics supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vnc</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>egl-headless</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>dbus</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </graphics>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <video supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='modelType'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vga</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>cirrus</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>none</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>bochs</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>ramfb</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </video>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <hostdev supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='mode'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>subsystem</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='startupPolicy'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>default</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>mandatory</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>requisite</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>optional</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='subsysType'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pci</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>scsi</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='capsType'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='pciBackend'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </hostdev>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <rng supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio-transitional</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtio-non-transitional</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>random</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>egd</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>builtin</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </rng>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <filesystem supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='driverType'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>path</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>handle</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>virtiofs</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </filesystem>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <tpm supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>tpm-tis</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>tpm-crb</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>emulator</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>external</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='backendVersion'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>2.0</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </tpm>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <redirdev supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='bus'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </redirdev>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <channel supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pty</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>unix</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </channel>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <crypto supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='model'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>qemu</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>builtin</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </crypto>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <interface supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='backendType'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>default</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>passt</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </interface>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <panic supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>isa</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>hyperv</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </panic>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <console supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>null</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vc</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pty</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>dev</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>file</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pipe</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>stdio</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>udp</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>tcp</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>unix</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>qemu-vdagent</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>dbus</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </console>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </devices>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <features>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <gic supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <vmcoreinfo supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <genid supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <backingStoreInput supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <backup supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <async-teardown supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <s390-pv supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <ps2 supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <tdx supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <sev supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <sgx supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <hyperv supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='features'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>relaxed</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vapic</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>spinlocks</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vpindex</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>runtime</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>synic</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>stimer</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>reset</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>vendor_id</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>frequencies</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>reenlightenment</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>tlbflush</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>ipi</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>avic</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>emsr_bitmap</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>xmm_input</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <defaults>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <spinlocks>4095</spinlocks>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <stimer_direct>on</stimer_direct>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <tlbflush_direct>on</tlbflush_direct>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <tlbflush_extended>on</tlbflush_extended>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <vendor_id>Linux KVM Hv</vendor_id>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </defaults>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </hyperv>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <launchSecurity supported='no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </features>
Feb  2 04:57:56 np0005604791 nova_compute[225341]: </domainCapabilities>
Feb  2 04:57:56 np0005604791 nova_compute[225341]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb  2 04:57:56 np0005604791 nova_compute[225341]: 2026-02-02 09:57:56.874 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Feb  2 04:57:56 np0005604791 nova_compute[225341]: 2026-02-02 09:57:56.878 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb  2 04:57:56 np0005604791 nova_compute[225341]: <domainCapabilities>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <path>/usr/libexec/qemu-kvm</path>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <domain>kvm</domain>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <machine>pc-i440fx-rhel7.6.0</machine>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <arch>x86_64</arch>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <vcpu max='240'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <iothreads supported='yes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <os supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <enum name='firmware'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <loader supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>rom</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>pflash</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='readonly'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>yes</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>no</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='secure'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>no</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </loader>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  </os>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:  <cpu>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='host-passthrough' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='hostPassthroughMigratable'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>on</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>off</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='maximum' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <enum name='maximumMigratable'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>on</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <value>off</value>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='host-model' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model fallback='forbid'>EPYC-Rome</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <vendor>AMD</vendor>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <maxphysaddr mode='passthrough' limit='40'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='x2apic'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc-deadline'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='hypervisor'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc_adjust'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='spec-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='stibp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='cmp_legacy'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='overflow-recov'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='succor'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='amd-ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='virt-ssbd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='lbrv'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc-scale'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='vmcb-clean'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='flushbyasid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='pause-filter'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='pfthreshold'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='svme-addr-chk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='require' name='lfence-always-serializing'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <feature policy='disable' name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:    <mode name='custom' supported='yes'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='ClearwaterForest'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ddpd-u'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sha512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm3'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='ClearwaterForest-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ddpd-u'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sha512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm3'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sm4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Dhyana-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Turin'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibpb-brtype'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbpb'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Turin-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibpb-brtype'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbpb'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-128'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-256'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-128'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-256'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx10-512'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-noTSX'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v6'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v7'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='KnightsMill'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4fmaps'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4vnniw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512er'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512pf'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='KnightsMill-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4fmaps'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-4vnniw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512er'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512pf'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G4-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G5'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tbm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G5-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tbm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v4'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v1'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v2'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v3'>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:56 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v5'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Snowridge'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='athlon'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='athlon-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='core2duo'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='core2duo-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='coreduo'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='coreduo-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='n270'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='n270-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='phenom'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='phenom-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  </cpu>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <memoryBacking supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <enum name='sourceType'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>file</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>anonymous</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>memfd</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  </memoryBacking>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <devices>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <disk supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='diskDevice'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>disk</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>cdrom</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>floppy</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>lun</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='bus'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>ide</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>fdc</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>scsi</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>sata</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio-transitional</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio-non-transitional</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </disk>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <graphics supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vnc</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>egl-headless</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>dbus</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </graphics>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <video supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='modelType'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vga</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>cirrus</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>none</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>bochs</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>ramfb</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </video>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <hostdev supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='mode'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>subsystem</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='startupPolicy'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>default</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>mandatory</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>requisite</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>optional</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='subsysType'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>pci</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>scsi</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='capsType'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='pciBackend'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </hostdev>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <rng supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio-transitional</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio-non-transitional</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>random</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>egd</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>builtin</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </rng>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <filesystem supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='driverType'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>path</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>handle</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtiofs</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </filesystem>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <tpm supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>tpm-tis</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>tpm-crb</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>emulator</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>external</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='backendVersion'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>2.0</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </tpm>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <redirdev supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='bus'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </redirdev>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <channel supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>pty</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>unix</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </channel>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <crypto supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='model'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>qemu</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>builtin</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </crypto>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <interface supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='backendType'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>default</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>passt</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </interface>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <panic supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>isa</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>hyperv</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </panic>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <console supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>null</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vc</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>pty</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>dev</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>file</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>pipe</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>stdio</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>udp</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>tcp</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>unix</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>qemu-vdagent</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>dbus</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </console>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  </devices>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <features>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <gic supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <vmcoreinfo supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <genid supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <backingStoreInput supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <backup supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <async-teardown supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <s390-pv supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <ps2 supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <tdx supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <sev supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <sgx supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <hyperv supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='features'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>relaxed</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vapic</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>spinlocks</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vpindex</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>runtime</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>synic</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>stimer</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>reset</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vendor_id</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>frequencies</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>reenlightenment</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>tlbflush</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>ipi</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>avic</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>emsr_bitmap</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>xmm_input</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <defaults>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <spinlocks>4095</spinlocks>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <stimer_direct>on</stimer_direct>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <tlbflush_direct>on</tlbflush_direct>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <tlbflush_extended>on</tlbflush_extended>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <vendor_id>Linux KVM Hv</vendor_id>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </defaults>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </hyperv>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <launchSecurity supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  </features>
Feb  2 04:57:57 np0005604791 nova_compute[225341]: </domainCapabilities>
Feb  2 04:57:57 np0005604791 nova_compute[225341]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:56.943 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb  2 04:57:57 np0005604791 nova_compute[225341]: <domainCapabilities>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <path>/usr/libexec/qemu-kvm</path>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <domain>kvm</domain>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <machine>pc-q35-rhel9.8.0</machine>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <arch>x86_64</arch>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <vcpu max='4096'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <iothreads supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <os supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <enum name='firmware'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>efi</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <loader supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>rom</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>pflash</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='readonly'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>yes</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>no</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='secure'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>yes</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>no</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </loader>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  </os>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <cpu>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <mode name='host-passthrough' supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='hostPassthroughMigratable'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>on</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>off</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <mode name='maximum' supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='maximumMigratable'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>on</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>off</value>
Feb  2 04:57:57 np0005604791 python3.9[226200]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <mode name='host-model' supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model fallback='forbid'>EPYC-Rome</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <vendor>AMD</vendor>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <maxphysaddr mode='passthrough' limit='40'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='x2apic'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc-deadline'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='hypervisor'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc_adjust'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='spec-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='stibp'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='ssbd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='cmp_legacy'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='overflow-recov'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='succor'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='ibrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='amd-ssbd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='virt-ssbd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='lbrv'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='tsc-scale'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='vmcb-clean'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='flushbyasid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='pause-filter'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='pfthreshold'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='svme-addr-chk'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='require' name='lfence-always-serializing'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <feature policy='disable' name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <mode name='custom' supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Broadwell'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-IBRS'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-noTSX'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-noTSX-IBRS'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Broadwell-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-noTSX'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Cascadelake-Server-v5'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='ClearwaterForest'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bhi-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ddpd-u'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sha512'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sm3'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sm4'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='ClearwaterForest-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bhi-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ddpd-u'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sha512'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sm3'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sm4'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Cooperlake-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Denverton'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Denverton-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Dhyana-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Genoa-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Milan-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Rome-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Turin'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibpb-brtype'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='prefetchi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbpb'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-Turin-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amd-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='auto-ibrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibpb-brtype'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='no-nested-data-bp'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='null-sel-clr-base'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='perfmon-v2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='prefetchi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbpb'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='stibp-always-on'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='EPYC-v5'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx10'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx10-128'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx10-256'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx10-512'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='GraniteRapids-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx10'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx10-128'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx10-256'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx10-512'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='prefetchiti'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Haswell'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Haswell-IBRS'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Haswell-noTSX'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Haswell-noTSX-IBRS'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Haswell-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-noTSX'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v5'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v6'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Icelake-Server-v7'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-IBRS'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='IvyBridge-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='KnightsMill'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-4fmaps'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-4vnniw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512er'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512pf'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='KnightsMill-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-4fmaps'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-4vnniw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512er'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512pf'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G4-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G5'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tbm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Opteron_G5-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fma4'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tbm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xop'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 systemd[1]: Stopping nova_compute container...
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='SapphireRapids-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='amx-tile'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-bf16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-fp16'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bitalg'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vbmi2'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrc'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fzrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='la57'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='taa-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='tsx-ldtrk'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='SierraForest'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='SierraForest-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ifma'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-ne-convert'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx-vnni-int8'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bhi-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='bus-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cmpccxadd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fbsdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='fsrs'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ibrs-all'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='intel-psfd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ipred-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='lam'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mcdt-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pbrsb-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='psdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rrsba-ctrl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='serialize'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vaes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='vpclmulqdq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-IBRS'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Client-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-IBRS'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='hle'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='rtm'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Skylake-Server-v5'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512bw'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512cd'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512dq'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512f'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='avx512vl'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='invpcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pcid'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='pku'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Snowridge'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='mpx'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v2'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v3'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='core-capability'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='split-lock-detect'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='Snowridge-v4'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='cldemote'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='erms'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='gfni'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdir64b'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='movdiri'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='xsaves'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='athlon'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='athlon-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='core2duo'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='core2duo-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='coreduo'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='coreduo-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='n270'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='n270-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='ss'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='phenom'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <blockers model='phenom-v1'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnow'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <feature name='3dnowext'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </blockers>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </mode>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  </cpu>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <memoryBacking supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <enum name='sourceType'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>file</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>anonymous</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <value>memfd</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  </memoryBacking>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <devices>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <disk supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='diskDevice'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>disk</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>cdrom</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>floppy</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>lun</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='bus'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>fdc</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>scsi</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>sata</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio-transitional</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio-non-transitional</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </disk>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <graphics supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vnc</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>egl-headless</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>dbus</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </graphics>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <video supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='modelType'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vga</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>cirrus</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>none</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>bochs</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>ramfb</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </video>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <hostdev supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='mode'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>subsystem</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='startupPolicy'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>default</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>mandatory</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>requisite</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>optional</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='subsysType'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>pci</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>scsi</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='capsType'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='pciBackend'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </hostdev>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <rng supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio-transitional</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtio-non-transitional</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>random</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>egd</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>builtin</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </rng>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <filesystem supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='driverType'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>path</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>handle</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>virtiofs</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </filesystem>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <tpm supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>tpm-tis</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>tpm-crb</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>emulator</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>external</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='backendVersion'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>2.0</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </tpm>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <redirdev supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='bus'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>usb</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </redirdev>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <channel supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>pty</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>unix</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </channel>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <crypto supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='model'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>qemu</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='backendModel'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>builtin</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </crypto>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <interface supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='backendType'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>default</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>passt</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </interface>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <panic supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='model'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>isa</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>hyperv</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </panic>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <console supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='type'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>null</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vc</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>pty</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>dev</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>file</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>pipe</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>stdio</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>udp</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>tcp</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>unix</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>qemu-vdagent</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>dbus</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </console>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  </devices>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  <features>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <gic supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <vmcoreinfo supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <genid supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <backingStoreInput supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <backup supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <async-teardown supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <s390-pv supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <ps2 supported='yes'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <tdx supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <sev supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <sgx supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <hyperv supported='yes'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <enum name='features'>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>relaxed</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vapic</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>spinlocks</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vpindex</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>runtime</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>synic</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>stimer</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>reset</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>vendor_id</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>frequencies</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>reenlightenment</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>tlbflush</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>ipi</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>avic</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>emsr_bitmap</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <value>xmm_input</value>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </enum>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      <defaults>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <spinlocks>4095</spinlocks>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <stimer_direct>on</stimer_direct>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <tlbflush_direct>on</tlbflush_direct>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <tlbflush_extended>on</tlbflush_extended>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:        <vendor_id>Linux KVM Hv</vendor_id>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:      </defaults>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    </hyperv>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:    <launchSecurity supported='no'/>
Feb  2 04:57:57 np0005604791 nova_compute[225341]:  </features>
Feb  2 04:57:57 np0005604791 nova_compute[225341]: </domainCapabilities>
Feb  2 04:57:57 np0005604791 nova_compute[225341]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:56.998 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:56.999 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:56.999 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:57.003 225345 INFO nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Secure Boot support detected#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:57.005 225345 INFO nova.virt.libvirt.driver [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:57.005 225345 INFO nova.virt.libvirt.driver [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:57.013 225345 DEBUG nova.virt.libvirt.driver [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:57.064 225345 INFO nova.virt.node [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Determined node identity 8e32c057-ad28-4c19-8374-763e0c1c8622 from /var/lib/nova/compute_id#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:57.082 225345 WARNING nova.compute.manager [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Compute nodes ['8e32c057-ad28-4c19-8374-763e0c1c8622'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:57.086 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:57.086 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 04:57:57 np0005604791 nova_compute[225341]: 2026-02-02 09:57:57.086 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 04:57:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:57:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:57 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:57 np0005604791 virtqemud[225988]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Feb  2 04:57:57 np0005604791 virtqemud[225988]: hostname: compute-1
Feb  2 04:57:57 np0005604791 virtqemud[225988]: End of file while reading data: Input/output error
Feb  2 04:57:57 np0005604791 systemd[1]: libpod-44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e.scope: Deactivated successfully.
Feb  2 04:57:57 np0005604791 systemd[1]: libpod-44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e.scope: Consumed 3.065s CPU time.
Feb  2 04:57:57 np0005604791 podman[226234]: 2026-02-02 09:57:57.505006021 +0000 UTC m=+0.462728510 container died 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Feb  2 04:57:57 np0005604791 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e-userdata-shm.mount: Deactivated successfully.
Feb  2 04:57:57 np0005604791 systemd[1]: var-lib-containers-storage-overlay-ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154-merged.mount: Deactivated successfully.
Feb  2 04:57:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:57 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:57 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000052s ======
Feb  2 04:57:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:58.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Feb  2 04:57:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:57:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:57:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:58.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:57:58 np0005604791 podman[226234]: 2026-02-02 09:57:58.648371276 +0000 UTC m=+1.606093735 container cleanup 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 04:57:58 np0005604791 podman[226234]: nova_compute
Feb  2 04:57:58 np0005604791 podman[226263]: nova_compute
Feb  2 04:57:58 np0005604791 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb  2 04:57:58 np0005604791 systemd[1]: Stopped nova_compute container.
Feb  2 04:57:58 np0005604791 systemd[1]: Starting nova_compute container...
Feb  2 04:57:58 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:57:58 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:58 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:58 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:58 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:58 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb  2 04:57:58 np0005604791 podman[226276]: 2026-02-02 09:57:58.891322631 +0000 UTC m=+0.140103349 container init 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Feb  2 04:57:58 np0005604791 podman[226276]: 2026-02-02 09:57:58.899227483 +0000 UTC m=+0.148008191 container start 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb  2 04:57:58 np0005604791 nova_compute[226294]: + sudo -E kolla_set_configs
Feb  2 04:57:58 np0005604791 podman[226276]: nova_compute
Feb  2 04:57:58 np0005604791 systemd[1]: Started nova_compute container.
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Validating config file
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying service configuration files
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Deleting /etc/ceph
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Creating directory /etc/ceph
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/ceph
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb  2 04:57:58 np0005604791 nova_compute[226294]: INFO:__main__:Writing out command to execute
Feb  2 04:57:59 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb  2 04:57:59 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb  2 04:57:59 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb  2 04:57:59 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb  2 04:57:59 np0005604791 nova_compute[226294]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb  2 04:57:59 np0005604791 nova_compute[226294]: ++ cat /run_command
Feb  2 04:57:59 np0005604791 nova_compute[226294]: + CMD=nova-compute
Feb  2 04:57:59 np0005604791 nova_compute[226294]: + ARGS=
Feb  2 04:57:59 np0005604791 nova_compute[226294]: + sudo kolla_copy_cacerts
Feb  2 04:57:59 np0005604791 nova_compute[226294]: + [[ ! -n '' ]]
Feb  2 04:57:59 np0005604791 nova_compute[226294]: + . kolla_extend_start
Feb  2 04:57:59 np0005604791 nova_compute[226294]: Running command: 'nova-compute'
Feb  2 04:57:59 np0005604791 nova_compute[226294]: + echo 'Running command: '\''nova-compute'\'''
Feb  2 04:57:59 np0005604791 nova_compute[226294]: + umask 0022
Feb  2 04:57:59 np0005604791 nova_compute[226294]: + exec nova-compute
Feb  2 04:57:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:59 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc720001b20 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb  2 04:57:59 np0005604791 kernel: ganesha.nfsd[221734]: segfault at 50 ip 00007fc7c990232e sp 00007fc7357f9210 error 4 in libntirpc.so.5.8[7fc7c98e7000+2c000] likely on CPU 6 (core 0, socket 6)
Feb  2 04:57:59 np0005604791 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb  2 04:57:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:59 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy ignored for local
Feb  2 04:57:59 np0005604791 systemd[1]: Started Process Core Dump (PID 226330/UID 0).
Feb  2 04:58:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:58:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 04:58:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:00.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 04:58:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:58:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:58:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:00.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:58:00 np0005604791 systemd-coredump[226331]: Process 219567 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 55:#012#0  0x00007fc7c990232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007fc7c990c900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Feb  2 04:58:00 np0005604791 python3.9[226460]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb  2 04:58:00 np0005604791 systemd[1]: systemd-coredump@8-226330-0.service: Deactivated successfully.
Feb  2 04:58:00 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 04:58:00 np0005604791 podman[226467]: 2026-02-02 09:58:00.687019155 +0000 UTC m=+0.030540091 container died 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True)
Feb  2 04:58:00 np0005604791 systemd[1]: var-lib-containers-storage-overlay-73dc38675cb95bd2ce9e007b58ce4c3a2f4231b213b0e6b3b48bce8e41439171-merged.mount: Deactivated successfully.
Feb  2 04:58:00 np0005604791 podman[226467]: 2026-02-02 09:58:00.818889644 +0000 UTC m=+0.162410540 container remove 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Feb  2 04:58:00 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb  2 04:58:00 np0005604791 systemd[1]: Started libpod-conmon-f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a.scope.
Feb  2 04:58:00 np0005604791 systemd[1]: Started libcrun container.
Feb  2 04:58:00 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff11bf020beacd9d846ed3e45cbd9b061d3e32b2ecd9b949861e46bfe46e1ac1/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb  2 04:58:00 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff11bf020beacd9d846ed3e45cbd9b061d3e32b2ecd9b949861e46bfe46e1ac1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb  2 04:58:00 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff11bf020beacd9d846ed3e45cbd9b061d3e32b2ecd9b949861e46bfe46e1ac1/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb  2 04:58:00 np0005604791 podman[226503]: 2026-02-02 09:58:00.932343372 +0000 UTC m=+0.190035935 container init f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb  2 04:58:00 np0005604791 podman[226503]: 2026-02-02 09:58:00.940792947 +0000 UTC m=+0.198485460 container start f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 04:58:00 np0005604791 python3.9[226460]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Applying nova statedir ownership
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb  2 04:58:00 np0005604791 nova_compute_init[226539]: INFO:nova_statedir:Nova statedir ownership complete
Feb  2 04:58:01 np0005604791 systemd[1]: libpod-f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a.scope: Deactivated successfully.
Feb  2 04:58:01 np0005604791 podman[226541]: 2026-02-02 09:58:01.019104298 +0000 UTC m=+0.047422043 container died f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm)
Feb  2 04:58:01 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 04:58:01 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.206s CPU time.
Feb  2 04:58:01 np0005604791 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a-userdata-shm.mount: Deactivated successfully.
Feb  2 04:58:01 np0005604791 systemd[1]: var-lib-containers-storage-overlay-ff11bf020beacd9d846ed3e45cbd9b061d3e32b2ecd9b949861e46bfe46e1ac1-merged.mount: Deactivated successfully.
Feb  2 04:58:01 np0005604791 podman[226563]: 2026-02-02 09:58:01.080510786 +0000 UTC m=+0.066314715 container cleanup f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 04:58:01 np0005604791 systemd[1]: libpod-conmon-f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a.scope: Deactivated successfully.
Feb  2 04:58:01 np0005604791 nova_compute[226294]: 2026-02-02 09:58:01.484 226298 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Feb  2 04:58:01 np0005604791 nova_compute[226294]: 2026-02-02 09:58:01.485 226298 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Feb  2 04:58:01 np0005604791 nova_compute[226294]: 2026-02-02 09:58:01.485 226298 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Feb  2 04:58:01 np0005604791 nova_compute[226294]: 2026-02-02 09:58:01.485 226298 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Feb  2 04:58:01 np0005604791 nova_compute[226294]: 2026-02-02 09:58:01.617 226298 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 04:58:01 np0005604791 nova_compute[226294]: 2026-02-02 09:58:01.637 226298 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 04:58:01 np0005604791 nova_compute[226294]: 2026-02-02 09:58:01.638 226298 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Feb  2 04:58:01 np0005604791 systemd[1]: session-53.scope: Deactivated successfully.
Feb  2 04:58:01 np0005604791 systemd[1]: session-53.scope: Consumed 1min 54.117s CPU time.
Feb  2 04:58:01 np0005604791 systemd-logind[805]: Session 53 logged out. Waiting for processes to exit.
Feb  2 04:58:01 np0005604791 systemd-logind[805]: Removed session 53.
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.070 226298 INFO nova.virt.driver [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.225 226298 INFO nova.compute.provider_config [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.243 226298 DEBUG oslo_concurrency.lockutils [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.243 226298 DEBUG oslo_concurrency.lockutils [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.244 226298 DEBUG oslo_concurrency.lockutils [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.244 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.244 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.244 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.245 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.245 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.245 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.245 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.247 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.247 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.247 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:58:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:02.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.248 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.248 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.248 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.248 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.249 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.249 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.249 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.249 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.249 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.250 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.250 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.250 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.250 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.251 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.251 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.251 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.251 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.251 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.252 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.252 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.252 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.252 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.253 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.253 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.253 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.253 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.254 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.254 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.254 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.255 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.255 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.255 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.255 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.256 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.256 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.256 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.256 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.256 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.257 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.257 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.257 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.257 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.257 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.258 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.258 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.258 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.258 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.259 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.259 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.259 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.259 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.260 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.260 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.260 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.260 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.260 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.261 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.261 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.261 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.261 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.262 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.262 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.262 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.262 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.263 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.263 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.263 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.263 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.263 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.265 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.265 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.265 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.265 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.265 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.266 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.266 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.266 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.266 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.266 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.267 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.267 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.267 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.267 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.267 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.268 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.268 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.268 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.268 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.268 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.269 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.269 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.269 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.269 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.269 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.270 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.270 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.270 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.270 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.270 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.271 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.271 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.271 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.271 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.271 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.272 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.272 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.272 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.272 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.272 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.273 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.273 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.273 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.273 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.273 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.274 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.274 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.274 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.274 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.274 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.275 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.275 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.275 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.275 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.275 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.276 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.276 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.276 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.276 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.276 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.277 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.277 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.277 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.277 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.277 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 WARNING oslo_config.cfg [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb  2 04:58:02 np0005604791 nova_compute[226294]: live_migration_uri is deprecated for removal in favor of two other options that
Feb  2 04:58:02 np0005604791 nova_compute[226294]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb  2 04:58:02 np0005604791 nova_compute[226294]: and ``live_migration_inbound_addr`` respectively.
Feb  2 04:58:02 np0005604791 nova_compute[226294]: ).  Its value may be silently ignored in the future.#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rbd_secret_uuid        = d241d473-9fcb-5f74-b163-f1ca4454e7f1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.363 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.363 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.363 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.363 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.363 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 04:58:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:02.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.384 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.385 226298 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.399 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.399 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.400 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.400 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.410 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f858bc1b610> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.412 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f858bc1b610> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.412 226298 INFO nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Connection event '1' reason 'None'#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.420 226298 INFO nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Libvirt host capabilities <capabilities>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <host>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <uuid>7f778d97-f318-4380-8776-2e4d99e5fd86</uuid>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <cpu>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <arch>x86_64</arch>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model>EPYC-Rome-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <vendor>AMD</vendor>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <microcode version='16777317'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <signature family='23' model='49' stepping='0'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <maxphysaddr mode='emulate' bits='40'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='x2apic'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='tsc-deadline'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='osxsave'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='hypervisor'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='tsc_adjust'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='spec-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='stibp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='arch-capabilities'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='cmp_legacy'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='topoext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='virt-ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='lbrv'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='tsc-scale'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='vmcb-clean'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='pause-filter'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='pfthreshold'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='svme-addr-chk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='rdctl-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='skip-l1dfl-vmentry'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='mds-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature name='pschange-mc-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <pages unit='KiB' size='4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <pages unit='KiB' size='2048'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <pages unit='KiB' size='1048576'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </cpu>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <power_management>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <suspend_mem/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </power_management>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <iommu support='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <migration_features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <live/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <uri_transports>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <uri_transport>tcp</uri_transport>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <uri_transport>rdma</uri_transport>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </uri_transports>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </migration_features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <topology>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <cells num='1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <cell id='0'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:          <memory unit='KiB'>7864292</memory>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:          <pages unit='KiB' size='4'>1966073</pages>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:          <pages unit='KiB' size='2048'>0</pages>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:          <pages unit='KiB' size='1048576'>0</pages>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:          <distances>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:            <sibling id='0' value='10'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:          </distances>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:          <cpus num='8'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:          </cpus>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        </cell>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </cells>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </topology>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <cache>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </cache>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <secmodel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model>selinux</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <doi>0</doi>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </secmodel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <secmodel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model>dac</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <doi>0</doi>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <baselabel type='kvm'>+107:+107</baselabel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <baselabel type='qemu'>+107:+107</baselabel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </secmodel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </host>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <guest>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <os_type>hvm</os_type>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <arch name='i686'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <wordsize>32</wordsize>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <domain type='qemu'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <domain type='kvm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </arch>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <pae/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <nonpae/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <acpi default='on' toggle='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <apic default='on' toggle='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <cpuselection/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <deviceboot/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <disksnapshot default='on' toggle='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <externalSnapshot/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </guest>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <guest>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <os_type>hvm</os_type>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <arch name='x86_64'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <wordsize>64</wordsize>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <domain type='qemu'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <domain type='kvm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </arch>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <acpi default='on' toggle='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <apic default='on' toggle='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <cpuselection/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <deviceboot/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <disksnapshot default='on' toggle='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <externalSnapshot/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </guest>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 
Feb  2 04:58:02 np0005604791 nova_compute[226294]: </capabilities>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: #033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.427 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.432 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb  2 04:58:02 np0005604791 nova_compute[226294]: <domainCapabilities>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <path>/usr/libexec/qemu-kvm</path>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <domain>kvm</domain>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <machine>pc-q35-rhel9.8.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <arch>i686</arch>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <vcpu max='4096'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <iothreads supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <os supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <enum name='firmware'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <loader supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>rom</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pflash</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='readonly'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>yes</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>no</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='secure'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>no</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </loader>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </os>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <cpu>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='host-passthrough' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='hostPassthroughMigratable'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>on</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>off</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='maximum' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='maximumMigratable'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>on</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>off</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='host-model' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model fallback='forbid'>EPYC-Rome</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <vendor>AMD</vendor>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <maxphysaddr mode='passthrough' limit='40'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='x2apic'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc-deadline'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='hypervisor'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc_adjust'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='spec-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='stibp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='cmp_legacy'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='overflow-recov'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='succor'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='amd-ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='virt-ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='lbrv'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc-scale'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='vmcb-clean'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='flushbyasid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='pause-filter'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='pfthreshold'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='svme-addr-chk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='lfence-always-serializing'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='disable' name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='custom' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='ClearwaterForest'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ddpd-u'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sha512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='ClearwaterForest-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ddpd-u'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sha512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cooperlake'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cooperlake-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cooperlake-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Dhyana-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Genoa'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Genoa-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Genoa-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='perfmon-v2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Turin'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibpb-brtype'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='perfmon-v2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbpb'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Turin-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibpb-brtype'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='perfmon-v2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbpb'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-128'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-256'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-128'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-256'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v6'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v7'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='KnightsMill'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4fmaps'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4vnniw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512er'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512pf'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='KnightsMill-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4fmaps'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4vnniw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512er'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512pf'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G4-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tbm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G5-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tbm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='athlon'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='athlon-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='core2duo'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='core2duo-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='coreduo'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='coreduo-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='n270'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='n270-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='phenom'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='phenom-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </cpu>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <memoryBacking supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <enum name='sourceType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>file</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>anonymous</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>memfd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </memoryBacking>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <devices>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <disk supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='diskDevice'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>disk</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>cdrom</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>floppy</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>lun</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='bus'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>fdc</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>scsi</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>usb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>sata</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-non-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </disk>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <graphics supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vnc</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>egl-headless</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>dbus</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </graphics>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <video supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='modelType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vga</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>cirrus</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>none</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>bochs</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>ramfb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </video>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <hostdev supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='mode'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>subsystem</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='startupPolicy'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>default</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>mandatory</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>requisite</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>optional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='subsysType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>usb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pci</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>scsi</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='capsType'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='pciBackend'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </hostdev>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <rng supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-non-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendModel'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>random</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>egd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>builtin</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </rng>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <filesystem supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='driverType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>path</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>handle</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtiofs</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </filesystem>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <tpm supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tpm-tis</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tpm-crb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendModel'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>emulator</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>external</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendVersion'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>2.0</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </tpm>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <redirdev supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='bus'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>usb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </redirdev>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <channel supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pty</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>unix</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </channel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <crypto supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>qemu</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendModel'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>builtin</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </crypto>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <interface supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>default</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>passt</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </interface>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <panic supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>isa</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>hyperv</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </panic>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <console supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>null</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vc</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pty</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>dev</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>file</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pipe</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>stdio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>udp</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tcp</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>unix</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>qemu-vdagent</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>dbus</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </console>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </devices>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <gic supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <vmcoreinfo supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <genid supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <backingStoreInput supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <backup supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <async-teardown supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <s390-pv supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <ps2 supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <tdx supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <sev supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <sgx supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <hyperv supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='features'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>relaxed</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vapic</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>spinlocks</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vpindex</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>runtime</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>synic</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>stimer</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>reset</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vendor_id</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>frequencies</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>reenlightenment</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tlbflush</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>ipi</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>avic</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>emsr_bitmap</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>xmm_input</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <defaults>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <spinlocks>4095</spinlocks>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <stimer_direct>on</stimer_direct>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <tlbflush_direct>on</tlbflush_direct>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <tlbflush_extended>on</tlbflush_extended>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <vendor_id>Linux KVM Hv</vendor_id>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </defaults>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </hyperv>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <launchSecurity supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: </domainCapabilities>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.436 226298 WARNING nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.437 226298 DEBUG nova.virt.libvirt.volume.mount [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.447 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb  2 04:58:02 np0005604791 nova_compute[226294]: <domainCapabilities>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <path>/usr/libexec/qemu-kvm</path>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <domain>kvm</domain>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <machine>pc-i440fx-rhel7.6.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <arch>i686</arch>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <vcpu max='240'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <iothreads supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <os supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <enum name='firmware'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <loader supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>rom</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pflash</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='readonly'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>yes</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>no</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='secure'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>no</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </loader>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </os>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <cpu>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='host-passthrough' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='hostPassthroughMigratable'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>on</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>off</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='maximum' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='maximumMigratable'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>on</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>off</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='host-model' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model fallback='forbid'>EPYC-Rome</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <vendor>AMD</vendor>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <maxphysaddr mode='passthrough' limit='40'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='x2apic'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc-deadline'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='hypervisor'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc_adjust'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='spec-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='stibp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='cmp_legacy'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='overflow-recov'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='succor'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='amd-ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='virt-ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='lbrv'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc-scale'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='vmcb-clean'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='flushbyasid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='pause-filter'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='pfthreshold'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='svme-addr-chk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='lfence-always-serializing'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='disable' name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='custom' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='ClearwaterForest'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ddpd-u'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sha512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='ClearwaterForest-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ddpd-u'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sha512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cooperlake'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cooperlake-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cooperlake-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Dhyana-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Genoa'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Genoa-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Genoa-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='perfmon-v2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Turin'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibpb-brtype'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='perfmon-v2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbpb'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Turin-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibpb-brtype'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='perfmon-v2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbpb'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-128'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-256'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-128'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-256'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v6'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v7'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='KnightsMill'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4fmaps'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4vnniw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512er'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512pf'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='KnightsMill-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4fmaps'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4vnniw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512er'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512pf'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G4-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tbm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G5-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tbm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='athlon'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='athlon-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='core2duo'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='core2duo-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='coreduo'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='coreduo-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='n270'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='n270-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='phenom'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='phenom-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </cpu>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <memoryBacking supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <enum name='sourceType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>file</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>anonymous</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>memfd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </memoryBacking>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <devices>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <disk supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='diskDevice'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>disk</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>cdrom</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>floppy</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>lun</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='bus'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>ide</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>fdc</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>scsi</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>usb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>sata</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-non-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </disk>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <graphics supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vnc</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>egl-headless</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>dbus</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </graphics>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <video supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='modelType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vga</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>cirrus</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>none</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>bochs</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>ramfb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </video>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <hostdev supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='mode'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>subsystem</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='startupPolicy'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>default</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>mandatory</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>requisite</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>optional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='subsysType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>usb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pci</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>scsi</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='capsType'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='pciBackend'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </hostdev>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <rng supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-non-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendModel'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>random</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>egd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>builtin</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </rng>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <filesystem supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='driverType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>path</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>handle</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtiofs</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </filesystem>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <tpm supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tpm-tis</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tpm-crb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendModel'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>emulator</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>external</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendVersion'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>2.0</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </tpm>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <redirdev supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='bus'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>usb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </redirdev>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <channel supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pty</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>unix</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </channel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <crypto supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>qemu</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendModel'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>builtin</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </crypto>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <interface supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>default</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>passt</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </interface>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <panic supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>isa</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>hyperv</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </panic>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <console supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>null</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vc</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pty</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>dev</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>file</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pipe</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>stdio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>udp</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tcp</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>unix</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>qemu-vdagent</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>dbus</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </console>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </devices>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <gic supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <vmcoreinfo supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <genid supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <backingStoreInput supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <backup supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <async-teardown supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <s390-pv supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <ps2 supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <tdx supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <sev supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <sgx supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <hyperv supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='features'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>relaxed</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vapic</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>spinlocks</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vpindex</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>runtime</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>synic</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>stimer</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>reset</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vendor_id</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>frequencies</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>reenlightenment</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tlbflush</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>ipi</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>avic</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>emsr_bitmap</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>xmm_input</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <defaults>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <spinlocks>4095</spinlocks>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <stimer_direct>on</stimer_direct>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <tlbflush_direct>on</tlbflush_direct>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <tlbflush_extended>on</tlbflush_extended>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <vendor_id>Linux KVM Hv</vendor_id>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </defaults>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </hyperv>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <launchSecurity supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: </domainCapabilities>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.500 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.504 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb  2 04:58:02 np0005604791 nova_compute[226294]: <domainCapabilities>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <path>/usr/libexec/qemu-kvm</path>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <domain>kvm</domain>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <machine>pc-q35-rhel9.8.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <arch>x86_64</arch>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <vcpu max='4096'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <iothreads supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <os supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <enum name='firmware'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>efi</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <loader supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>rom</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pflash</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='readonly'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>yes</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>no</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='secure'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>yes</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>no</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </loader>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </os>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <cpu>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='host-passthrough' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='hostPassthroughMigratable'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>on</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>off</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='maximum' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='maximumMigratable'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>on</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>off</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='host-model' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model fallback='forbid'>EPYC-Rome</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <vendor>AMD</vendor>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <maxphysaddr mode='passthrough' limit='40'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='x2apic'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc-deadline'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='hypervisor'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc_adjust'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='spec-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='stibp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='cmp_legacy'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='overflow-recov'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='succor'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='amd-ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='virt-ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='lbrv'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc-scale'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='vmcb-clean'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='flushbyasid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='pause-filter'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='pfthreshold'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='svme-addr-chk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='lfence-always-serializing'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='disable' name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='custom' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='ClearwaterForest'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ddpd-u'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sha512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='ClearwaterForest-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ddpd-u'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sha512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cooperlake'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cooperlake-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cooperlake-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Denverton-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Dhyana-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Genoa'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Genoa-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Genoa-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='perfmon-v2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Milan-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Rome-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Turin'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibpb-brtype'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='perfmon-v2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbpb'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-Turin-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amd-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='auto-ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vp2intersect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fs-gs-base-ns'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibpb-brtype'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='no-nested-data-bp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='null-sel-clr-base'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='perfmon-v2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbpb'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='srso-user-kernel-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='stibp-always-on'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='EPYC-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-128'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-256'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='GraniteRapids-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-128'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-256'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx10-512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Haswell-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v6'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Icelake-Server-v7'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='IvyBridge-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='KnightsMill'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4fmaps'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4vnniw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512er'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512pf'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='KnightsMill-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4fmaps'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-4vnniw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512er'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512pf'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G4-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tbm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Opteron_G5-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fma4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tbm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xop'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SapphireRapids-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='amx-tile'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-fp16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-vpopcntdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bitalg'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vbmi2'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrc'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fzrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='la57'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='taa-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='tsx-ldtrk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='SierraForest-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Client-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Skylake-Server-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mpx'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='core-capability'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='split-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Snowridge-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='athlon'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='athlon-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='core2duo'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='core2duo-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='coreduo'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='coreduo-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='n270'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='n270-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='phenom'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='phenom-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnow'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='3dnowext'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </cpu>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <memoryBacking supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <enum name='sourceType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>file</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>anonymous</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>memfd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </memoryBacking>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <devices>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <disk supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='diskDevice'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>disk</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>cdrom</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>floppy</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>lun</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='bus'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>fdc</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>scsi</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>usb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>sata</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-non-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </disk>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <graphics supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vnc</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>egl-headless</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>dbus</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </graphics>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <video supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='modelType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vga</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>cirrus</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>none</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>bochs</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>ramfb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </video>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <hostdev supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='mode'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>subsystem</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='startupPolicy'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>default</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>mandatory</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>requisite</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>optional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='subsysType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>usb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pci</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>scsi</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='capsType'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='pciBackend'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </hostdev>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <rng supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtio-non-transitional</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendModel'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>random</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>egd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>builtin</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </rng>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <filesystem supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='driverType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>path</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>handle</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>virtiofs</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </filesystem>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <tpm supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tpm-tis</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tpm-crb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendModel'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>emulator</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>external</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendVersion'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>2.0</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </tpm>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <redirdev supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='bus'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>usb</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </redirdev>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <channel supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pty</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>unix</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </channel>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <crypto supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>qemu</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendModel'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>builtin</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </crypto>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <interface supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='backendType'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>default</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>passt</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </interface>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <panic supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='model'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>isa</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>hyperv</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </panic>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <console supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>null</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vc</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pty</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>dev</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>file</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pipe</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>stdio</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>udp</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tcp</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>unix</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>qemu-vdagent</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>dbus</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </console>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </devices>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <gic supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <vmcoreinfo supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <genid supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <backingStoreInput supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <backup supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <async-teardown supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <s390-pv supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <ps2 supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <tdx supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <sev supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <sgx supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <hyperv supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='features'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>relaxed</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vapic</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>spinlocks</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vpindex</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>runtime</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>synic</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>stimer</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>reset</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>vendor_id</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>frequencies</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>reenlightenment</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>tlbflush</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>ipi</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>avic</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>emsr_bitmap</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>xmm_input</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <defaults>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <spinlocks>4095</spinlocks>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <stimer_direct>on</stimer_direct>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <tlbflush_direct>on</tlbflush_direct>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <tlbflush_extended>on</tlbflush_extended>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <vendor_id>Linux KVM Hv</vendor_id>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </defaults>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </hyperv>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <launchSecurity supported='no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </features>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: </domainCapabilities>
Feb  2 04:58:02 np0005604791 nova_compute[226294]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb  2 04:58:02 np0005604791 nova_compute[226294]: 2026-02-02 09:58:02.590 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb  2 04:58:02 np0005604791 nova_compute[226294]: <domainCapabilities>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <path>/usr/libexec/qemu-kvm</path>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <domain>kvm</domain>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <machine>pc-i440fx-rhel7.6.0</machine>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <arch>x86_64</arch>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <vcpu max='240'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <iothreads supported='yes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <os supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <enum name='firmware'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <loader supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='type'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>rom</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>pflash</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='readonly'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>yes</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>no</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='secure'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>no</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </loader>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  </os>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:  <cpu>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='host-passthrough' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='hostPassthroughMigratable'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>on</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>off</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='maximum' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <enum name='maximumMigratable'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>on</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <value>off</value>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </enum>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='host-model' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model fallback='forbid'>EPYC-Rome</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <vendor>AMD</vendor>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <maxphysaddr mode='passthrough' limit='40'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='x2apic'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc-deadline'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='hypervisor'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc_adjust'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='spec-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='stibp'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='cmp_legacy'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='overflow-recov'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='succor'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='ibrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='amd-ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='virt-ssbd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='lbrv'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='tsc-scale'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='vmcb-clean'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='flushbyasid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='pause-filter'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='pfthreshold'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='svme-addr-chk'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='require' name='lfence-always-serializing'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <feature policy='disable' name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    </mode>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:    <mode name='custom' supported='yes'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-noTSX-IBRS'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Broadwell-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-noTSX'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v2'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='hle'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rtm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v3'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v4'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cascadelake-Server-v5'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='ClearwaterForest'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ddpd-u'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sha512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='ClearwaterForest-v1'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ifma'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-ne-convert'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx-vnni-int8'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bhi-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='bus-lock-detect'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cldemote'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='cmpccxadd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ddpd-u'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fbsdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrm'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='fsrs'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='gfni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ibrs-all'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='intel-psfd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='invpcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ipred-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='lam'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='mcdt-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdir64b'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='movdiri'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pbrsb-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pcid'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='pku'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='prefetchiti'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='psdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='rrsba-ctrl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sbdr-ssdp-no'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='serialize'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sha512'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm3'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='sm4'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='ss'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vaes'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='vpclmulqdq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='xsaves'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      </blockers>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:      <blockers model='Cooperlake'>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512-bf16'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512bw'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512cd'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512dq'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512f'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vl'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='avx512vnni'/>
Feb  2 04:58:02 np0005604791 nova_compute[226294]:        <feature name='erms'/>
Feb  2 05:03:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:30 np0005604791 rsyslogd[1009]: imjournal: 4611 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb  2 05:03:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:30.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:03:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:30.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:03:31 np0005604791 nova_compute[226294]: 2026-02-02 10:03:31.322 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:31 np0005604791 nova_compute[226294]: 2026-02-02 10:03:31.582 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:31 np0005604791 NetworkManager[49055]: <info>  [1770026611.5835] manager: (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/27)
Feb  2 05:03:31 np0005604791 NetworkManager[49055]: <info>  [1770026611.5846] device (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 05:03:31 np0005604791 NetworkManager[49055]: <warn>  [1770026611.5848] device (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb  2 05:03:31 np0005604791 NetworkManager[49055]: <info>  [1770026611.5866] manager: (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Feb  2 05:03:31 np0005604791 NetworkManager[49055]: <info>  [1770026611.5874] device (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb  2 05:03:31 np0005604791 NetworkManager[49055]: <warn>  [1770026611.5875] device (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb  2 05:03:31 np0005604791 NetworkManager[49055]: <info>  [1770026611.5891] manager: (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Feb  2 05:03:31 np0005604791 NetworkManager[49055]: <info>  [1770026611.5904] manager: (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Feb  2 05:03:31 np0005604791 NetworkManager[49055]: <info>  [1770026611.5913] device (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb  2 05:03:31 np0005604791 NetworkManager[49055]: <info>  [1770026611.5920] device (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb  2 05:03:31 np0005604791 ovn_controller[133666]: 2026-02-02T10:03:31Z|00032|binding|INFO|Releasing lport a2dfb49c-9120-4cbe-a32f-76266c8258fd from this chassis (sb_readonly=0)
Feb  2 05:03:31 np0005604791 nova_compute[226294]: 2026-02-02 10:03:31.602 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:31 np0005604791 nova_compute[226294]: 2026-02-02 10:03:31.615 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:03:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:32.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:03:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:32.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:03:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:34 np0005604791 nova_compute[226294]: 2026-02-02 10:03:34.590 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:34.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:34.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40037a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:36 np0005604791 nova_compute[226294]: 2026-02-02 10:03:36.323 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:36 np0005604791 ovn_controller[133666]: 2026-02-02T10:03:36Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:4d:0e 10.100.0.25
Feb  2 05:03:36 np0005604791 ovn_controller[133666]: 2026-02-02T10:03:36Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:4d:0e 10.100.0.25
Feb  2 05:03:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:36.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:36.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100336 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 05:03:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:03:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40037a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:38.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:38.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:39 np0005604791 nova_compute[226294]: 2026-02-02 10:03:39.628 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:40.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:03:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:40.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:03:41 np0005604791 nova_compute[226294]: 2026-02-02 10:03:41.325 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:03:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:03:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:42.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:03:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:03:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:42.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.612 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.613 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.613 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.613 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.614 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.615 226298 INFO nova.compute.manager [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Terminating instance#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.617 226298 DEBUG nova.compute.manager [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Feb  2 05:03:43 np0005604791 kernel: tap6cd44411-fc (unregistering): left promiscuous mode
Feb  2 05:03:43 np0005604791 NetworkManager[49055]: <info>  [1770026623.6714] device (tap6cd44411-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb  2 05:03:43 np0005604791 ovn_controller[133666]: 2026-02-02T10:03:43Z|00033|binding|INFO|Releasing lport 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 from this chassis (sb_readonly=0)
Feb  2 05:03:43 np0005604791 ovn_controller[133666]: 2026-02-02T10:03:43Z|00034|binding|INFO|Setting lport 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 down in Southbound
Feb  2 05:03:43 np0005604791 ovn_controller[133666]: 2026-02-02T10:03:43Z|00035|binding|INFO|Removing iface tap6cd44411-fc ovn-installed in OVS
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.677 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.684 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:4d:0e 10.100.0.25'], port_security=['fa:16:3e:7b:4d:0e 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '7440c4af-7e45-4796-ac03-ddd1eb035702', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e0f239e2-848d-4af8-8655-52de33d6c78c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ef61564f-168a-4d39-98a6-1b124de25a8e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=690cea54-b458-4b47-8c01-a695a89a554b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.685 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 in datapath e0f239e2-848d-4af8-8655-52de33d6c78c unbound from our chassis#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.687 143542 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e0f239e2-848d-4af8-8655-52de33d6c78c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.687 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[0295001c-fabd-4952-abaf-59b240f5050f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.688 143542 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c namespace which is not needed anymore#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.695 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84980016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:43 np0005604791 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Feb  2 05:03:43 np0005604791 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 12.830s CPU time.
Feb  2 05:03:43 np0005604791 systemd-machined[195072]: Machine qemu-1-instance-00000002 terminated.
Feb  2 05:03:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100343 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 05:03:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:43 np0005604791 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [NOTICE]   (229927) : haproxy version is 2.8.14-c23fe91
Feb  2 05:03:43 np0005604791 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [NOTICE]   (229927) : path to executable is /usr/sbin/haproxy
Feb  2 05:03:43 np0005604791 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [WARNING]  (229927) : Exiting Master process...
Feb  2 05:03:43 np0005604791 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [WARNING]  (229927) : Exiting Master process...
Feb  2 05:03:43 np0005604791 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [ALERT]    (229927) : Current worker (229929) exited with code 143 (Terminated)
Feb  2 05:03:43 np0005604791 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [WARNING]  (229927) : All workers exited. Exiting... (0)
Feb  2 05:03:43 np0005604791 systemd[1]: libpod-32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23.scope: Deactivated successfully.
Feb  2 05:03:43 np0005604791 podman[230022]: 2026-02-02 10:03:43.82983318 +0000 UTC m=+0.046244922 container died 32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.847 226298 INFO nova.virt.libvirt.driver [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Instance destroyed successfully.#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.848 226298 DEBUG nova.objects.instance [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'resources' on Instance uuid 7440c4af-7e45-4796-ac03-ddd1eb035702 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:03:43 np0005604791 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23-userdata-shm.mount: Deactivated successfully.
Feb  2 05:03:43 np0005604791 systemd[1]: var-lib-containers-storage-overlay-0c2211d282dfacd7338273edfd4401280099477e08d1784ec40396df24bf6f8d-merged.mount: Deactivated successfully.
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.862 226298 DEBUG nova.virt.libvirt.vif [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:03:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2085006119',display_name='tempest-TestNetworkBasicOps-server-2085006119',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2085006119',id=2,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPTNHym9kIQXfz4suqCv0Mc6jF6e6B0Wqnxt3FW6kpn3GvVkI5IjdsdGcg/l/4S1jaKaNrV8XIFTm81yb+PAyRJlVvfR+xXnnQd/vZvQFfCYnwki1MFmz37JvJeahfBCnw==',key_name='tempest-TestNetworkBasicOps-1317968913',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:03:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-qht7ug2q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:03:23Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=7440c4af-7e45-4796-ac03-ddd1eb035702,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.862 226298 DEBUG nova.network.os_vif_util [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.863 226298 DEBUG nova.network.os_vif_util [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:4d:0e,bridge_name='br-int',has_traffic_filtering=True,id=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5,network=Network(e0f239e2-848d-4af8-8655-52de33d6c78c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cd44411-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.863 226298 DEBUG os_vif [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:4d:0e,bridge_name='br-int',has_traffic_filtering=True,id=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5,network=Network(e0f239e2-848d-4af8-8655-52de33d6c78c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cd44411-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.865 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.865 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6cd44411-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:03:43 np0005604791 podman[230022]: 2026-02-02 10:03:43.866342904 +0000 UTC m=+0.082754666 container cleanup 32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.867 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.868 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.871 226298 INFO os_vif [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:4d:0e,bridge_name='br-int',has_traffic_filtering=True,id=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5,network=Network(e0f239e2-848d-4af8-8655-52de33d6c78c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cd44411-fc')#033[00m
Feb  2 05:03:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:43 np0005604791 systemd[1]: libpod-conmon-32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23.scope: Deactivated successfully.
Feb  2 05:03:43 np0005604791 podman[230061]: 2026-02-02 10:03:43.9263933 +0000 UTC m=+0.038902929 container remove 32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.929 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[36af9218-dd7a-4acf-93f5-059240575fd2]: (4, ('Mon Feb  2 10:03:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c (32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23)\n32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23\nMon Feb  2 10:03:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c (32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23)\n32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.931 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[bf77539e-a83a-42c6-a3a1-6ba252f8a27b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.932 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape0f239e2-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.934 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:43 np0005604791 kernel: tape0f239e2-80: left promiscuous mode
Feb  2 05:03:43 np0005604791 nova_compute[226294]: 2026-02-02 10:03:43.940 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.942 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[70994b38-efc0-4946-8411-5188e411bfd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.960 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[54181e19-7a0b-42a5-a1c7-76771643ca0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.961 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[229e5397-f2d3-4e0e-99b7-6edcc397e206]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.972 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[d559afe4-4b75-4ea4-b331-8fa8f4366a1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376957, 'reachable_time': 43124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230097, 'error': None, 'target': 'ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:03:43 np0005604791 systemd[1]: run-netns-ovnmeta\x2de0f239e2\x2d848d\x2d4af8\x2d8655\x2d52de33d6c78c.mount: Deactivated successfully.
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.980 143813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Feb  2 05:03:43 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.980 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[fdad92a5-3c20-47f0-b9cb-714e1f4f9d51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.316 226298 INFO nova.virt.libvirt.driver [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Deleting instance files /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702_del#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.317 226298 INFO nova.virt.libvirt.driver [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Deletion of /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702_del complete#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.381 226298 DEBUG nova.virt.libvirt.host [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.382 226298 INFO nova.virt.libvirt.host [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] UEFI support detected#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.383 226298 INFO nova.compute.manager [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Took 0.77 seconds to destroy the instance on the hypervisor.#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.384 226298 DEBUG oslo.service.loopingcall [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.384 226298 DEBUG nova.compute.manager [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.384 226298 DEBUG nova.network.neutron [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.629 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:03:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:44.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.777 226298 DEBUG nova.compute.manager [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-vif-unplugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.778 226298 DEBUG oslo_concurrency.lockutils [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.778 226298 DEBUG oslo_concurrency.lockutils [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.778 226298 DEBUG oslo_concurrency.lockutils [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.779 226298 DEBUG nova.compute.manager [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] No waiting events found dispatching network-vif-unplugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:03:44 np0005604791 nova_compute[226294]: 2026-02-02 10:03:44.779 226298 DEBUG nova.compute.manager [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-vif-unplugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Feb  2 05:03:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:03:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:44.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:03:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:44.904 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:03:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:44.904 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:03:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:03:44.904 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:03:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84980016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.577 226298 DEBUG nova.network.neutron [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.602 226298 INFO nova.compute.manager [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Took 2.22 seconds to deallocate network for instance.#033[00m
Feb  2 05:03:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.666 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.667 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:03:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:03:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:46.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.747 226298 DEBUG oslo_concurrency.processutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.862 226298 DEBUG nova.compute.manager [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.863 226298 DEBUG oslo_concurrency.lockutils [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.863 226298 DEBUG oslo_concurrency.lockutils [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.864 226298 DEBUG oslo_concurrency.lockutils [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.864 226298 DEBUG nova.compute.manager [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] No waiting events found dispatching network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.865 226298 WARNING nova.compute.manager [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received unexpected event network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 for instance with vm_state deleted and task_state None.#033[00m
Feb  2 05:03:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:46 np0005604791 nova_compute[226294]: 2026-02-02 10:03:46.865 226298 DEBUG nova.compute.manager [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-vif-deleted-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:03:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:46.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:03:47 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3299572418' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.229 226298 DEBUG oslo_concurrency.processutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.236 226298 DEBUG nova.compute.provider_tree [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.280 226298 ERROR nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [req-c8004082-069f-4a06-823b-f50ed91fc676] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 8e32c057-ad28-4c19-8374-763e0c1c8622.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-c8004082-069f-4a06-823b-f50ed91fc676"}]}#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.298 226298 DEBUG nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.325 226298 DEBUG nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.326 226298 DEBUG nova.compute.provider_tree [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb  2 05:03:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.351 226298 DEBUG nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.381 226298 DEBUG nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.430 226298 DEBUG oslo_concurrency.processutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:03:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:03:47 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1182949554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.866 226298 DEBUG oslo_concurrency.processutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.872 226298 DEBUG nova.compute.provider_tree [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb  2 05:03:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.925 226298 DEBUG nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updated inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.925 226298 DEBUG nova.compute.provider_tree [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.926 226298 DEBUG nova.compute.provider_tree [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb  2 05:03:47 np0005604791 nova_compute[226294]: 2026-02-02 10:03:47.965 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:03:48 np0005604791 nova_compute[226294]: 2026-02-02 10:03:48.003 226298 INFO nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Deleted allocations for instance 7440c4af-7e45-4796-ac03-ddd1eb035702#033[00m
Feb  2 05:03:48 np0005604791 nova_compute[226294]: 2026-02-02 10:03:48.164 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:03:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:48.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:48 np0005604791 nova_compute[226294]: 2026-02-02 10:03:48.867 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:03:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:48.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:03:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 05:03:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 05:03:49 np0005604791 nova_compute[226294]: 2026-02-02 10:03:49.631 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498002720 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 05:03:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 05:03:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:50.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:03:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:50.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:03:50 np0005604791 nova_compute[226294]: 2026-02-02 10:03:50.968 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:50 np0005604791 nova_compute[226294]: 2026-02-02 10:03:50.981 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:03:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:52.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:52.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 05:03:53 np0005604791 podman[230151]: 2026-02-02 10:03:53.456955558 +0000 UTC m=+0.126004008 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:03:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498002720 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:53 np0005604791 nova_compute[226294]: 2026-02-02 10:03:53.869 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:53 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:03:53 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3528945408' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:03:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:54 np0005604791 nova_compute[226294]: 2026-02-02 10:03:54.633 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:54.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:54.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 05:03:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:56.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:56.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:03:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:03:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:58.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:03:58 np0005604791 nova_compute[226294]: 2026-02-02 10:03:58.845 226298 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1770026623.8443913, 7440c4af-7e45-4796-ac03-ddd1eb035702 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Feb  2 05:03:58 np0005604791 nova_compute[226294]: 2026-02-02 10:03:58.846 226298 INFO nova.compute.manager [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] VM Stopped (Lifecycle Event)#033[00m
Feb  2 05:03:58 np0005604791 nova_compute[226294]: 2026-02-02 10:03:58.864 226298 DEBUG nova.compute.manager [None req-b3a0dc63-054f-480f-8518-83b7344a459f - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb  2 05:03:58 np0005604791 nova_compute[226294]: 2026-02-02 10:03:58.870 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:03:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:03:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:58.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:03:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100358 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 05:03:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 05:03:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 05:03:59 np0005604791 nova_compute[226294]: 2026-02-02 10:03:59.682 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:03:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:03:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:00 np0005604791 podman[230207]: 2026-02-02 10:04:00.394654726 +0000 UTC m=+0.066652131 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:04:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:00.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:04:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:00.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:04:01 np0005604791 podman[230348]: 2026-02-02 10:04:01.07693985 +0000 UTC m=+0.068467319 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 05:04:01 np0005604791 podman[230348]: 2026-02-02 10:04:01.155627707 +0000 UTC m=+0.147155186 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb  2 05:04:01 np0005604791 podman[230481]: 2026-02-02 10:04:01.716804844 +0000 UTC m=+0.051834600 container exec 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 05:04:01 np0005604791 podman[230481]: 2026-02-02 10:04:01.725448302 +0000 UTC m=+0.060478038 container exec_died 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 05:04:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:01 np0005604791 podman[230554]: 2026-02-02 10:04:01.972095234 +0000 UTC m=+0.060550789 container exec 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Feb  2 05:04:01 np0005604791 podman[230554]: 2026-02-02 10:04:01.982367706 +0000 UTC m=+0.070823261 container exec_died 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Feb  2 05:04:02 np0005604791 podman[230620]: 2026-02-02 10:04:02.216362394 +0000 UTC m=+0.063077017 container exec 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 05:04:02 np0005604791 podman[230620]: 2026-02-02 10:04:02.25560625 +0000 UTC m=+0.102320843 container exec_died 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.283451) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642283506, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1114, "num_deletes": 250, "total_data_size": 2480949, "memory_usage": 2518520, "flush_reason": "Manual Compaction"}
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642297209, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1022451, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23214, "largest_seqno": 24323, "table_properties": {"data_size": 1018432, "index_size": 1607, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10580, "raw_average_key_size": 20, "raw_value_size": 1009718, "raw_average_value_size": 1949, "num_data_blocks": 71, "num_entries": 518, "num_filter_entries": 518, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026553, "oldest_key_time": 1770026553, "file_creation_time": 1770026642, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 13823 microseconds, and 4723 cpu microseconds.
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.297275) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1022451 bytes OK
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.297298) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.298842) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.298865) EVENT_LOG_v1 {"time_micros": 1770026642298858, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.298887) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 2475493, prev total WAL file size 2475493, number of live WAL files 2.
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.299722) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353030' seq:72057594037927935, type:22 .. '6D67727374617400373531' seq:0, type:0; will stop at (end)
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(998KB)], [42(14MB)]
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642299766, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 15733684, "oldest_snapshot_seqno": -1}
Feb  2 05:04:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5503 keys, 12269977 bytes, temperature: kUnknown
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642462914, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 12269977, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12234320, "index_size": 20827, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 139121, "raw_average_key_size": 25, "raw_value_size": 12135889, "raw_average_value_size": 2205, "num_data_blocks": 850, "num_entries": 5503, "num_filter_entries": 5503, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026642, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.463176) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 12269977 bytes
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.465941) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 96.4 rd, 75.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 14.0 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(27.4) write-amplify(12.0) OK, records in: 5983, records dropped: 480 output_compression: NoCompression
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.465961) EVENT_LOG_v1 {"time_micros": 1770026642465952, "job": 24, "event": "compaction_finished", "compaction_time_micros": 163215, "compaction_time_cpu_micros": 33954, "output_level": 6, "num_output_files": 1, "total_output_size": 12269977, "num_input_records": 5983, "num_output_records": 5503, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642466169, "job": 24, "event": "table_file_deletion", "file_number": 44}
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642467840, "job": 24, "event": "table_file_deletion", "file_number": 42}
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.299604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.467958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.467966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.467971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.467975) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.467980) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:02 np0005604791 podman[230686]: 2026-02-02 10:04:02.485174911 +0000 UTC m=+0.061228607 container exec 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, name=keepalived, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, vcs-type=git, description=keepalived for Ceph, release=1793, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph.)
Feb  2 05:04:02 np0005604791 podman[230686]: 2026-02-02 10:04:02.52752578 +0000 UTC m=+0.103579446 container exec_died 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, release=1793, name=keepalived, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, vcs-type=git, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph)
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:04:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:04:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:04:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:02.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:04:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:02.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:03 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:04:03 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:04:03 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb  2 05:04:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:03 np0005604791 nova_compute[226294]: 2026-02-02 10:04:03.871 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:04 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb  2 05:04:04 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:04:04 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:04:04 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:04:04 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:04:04 np0005604791 nova_compute[226294]: 2026-02-02 10:04:04.684 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:04.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:04 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:04:04 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3260463622' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:04:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:04:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:04.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:04:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100405 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 05:04:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:06.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:04:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:06.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.287 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.301 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.302 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.302 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.312 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.312 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.312 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.312 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.313 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.313 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:04:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:04:07 np0005604791 nova_compute[226294]: 2026-02-02 10:04:07.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:04:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:08 np0005604791 nova_compute[226294]: 2026-02-02 10:04:08.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:04:08 np0005604791 nova_compute[226294]: 2026-02-02 10:04:08.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:04:08 np0005604791 nova_compute[226294]: 2026-02-02 10:04:08.679 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:04:08 np0005604791 nova_compute[226294]: 2026-02-02 10:04:08.680 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:04:08 np0005604791 nova_compute[226294]: 2026-02-02 10:04:08.680 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:04:08 np0005604791 nova_compute[226294]: 2026-02-02 10:04:08.681 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:04:08 np0005604791 nova_compute[226294]: 2026-02-02 10:04:08.681 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:04:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:08.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:08 np0005604791 nova_compute[226294]: 2026-02-02 10:04:08.873 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:08.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:09 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:04:09 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2465839043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.137 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.330 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.331 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4928MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.332 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.332 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.406 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.406 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.420 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.725 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:09 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:04:09 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3035330726' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.899 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.905 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:04:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.922 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.946 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:04:09 np0005604791 nova_compute[226294]: 2026-02-02 10:04:09.947 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:04:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:04:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:04:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:04:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:10.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:04:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:04:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:10.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:04:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:12.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:12.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:13 np0005604791 nova_compute[226294]: 2026-02-02 10:04:13.877 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:14 np0005604791 nova_compute[226294]: 2026-02-02 10:04:14.774 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:14.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:14.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:15 np0005604791 nova_compute[226294]: 2026-02-02 10:04:15.454 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:15 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:04:15.455 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:04:15 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:04:15.457 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb  2 05:04:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:16.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:04:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:16.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:04:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:18.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:18 np0005604791 nova_compute[226294]: 2026-02-02 10:04:18.878 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:18.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:19 np0005604791 nova_compute[226294]: 2026-02-02 10:04:19.775 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:20 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:04:20.459 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:04:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:20.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:20.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:22.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:22.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100423 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 05:04:23 np0005604791 nova_compute[226294]: 2026-02-02 10:04:23.881 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:24 np0005604791 podman[230907]: 2026-02-02 10:04:24.445536063 +0000 UTC m=+0.115014967 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb  2 05:04:24 np0005604791 nova_compute[226294]: 2026-02-02 10:04:24.777 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:04:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:24.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:04:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:24.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:26.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:26.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:28.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:28 np0005604791 nova_compute[226294]: 2026-02-02 10:04:28.882 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:28.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:29 np0005604791 nova_compute[226294]: 2026-02-02 10:04:29.823 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:30.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:30.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:31 np0005604791 podman[230937]: 2026-02-02 10:04:31.394101128 +0000 UTC m=+0.070241805 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:04:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:04:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:32.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:04:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:32.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:33 np0005604791 ovn_controller[133666]: 2026-02-02T10:04:33Z|00036|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Feb  2 05:04:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 05:04:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:33 np0005604791 nova_compute[226294]: 2026-02-02 10:04:33.884 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:34 np0005604791 nova_compute[226294]: 2026-02-02 10:04:34.826 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:04:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:34.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:04:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:34.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 05:04:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 05:04:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:04:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:36.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:04:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:36.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:38.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:38 np0005604791 nova_compute[226294]: 2026-02-02 10:04:38.886 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:38.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 05:04:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:39 np0005604791 nova_compute[226294]: 2026-02-02 10:04:39.870 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:04:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:40.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:04:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:04:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:40.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:04:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:42.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:04:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:42.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:04:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:43 np0005604791 nova_compute[226294]: 2026-02-02 10:04:43.888 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:44.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:04:44.905 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:04:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:04:44.906 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:04:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:04:44.906 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:04:44 np0005604791 nova_compute[226294]: 2026-02-02 10:04:44.907 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:44.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100445 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 05:04:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8000f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:46.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:04:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:46.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.303089) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687303680, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 760, "num_deletes": 256, "total_data_size": 1528395, "memory_usage": 1554632, "flush_reason": "Manual Compaction"}
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687313442, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1010544, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24328, "largest_seqno": 25083, "table_properties": {"data_size": 1006812, "index_size": 1512, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8235, "raw_average_key_size": 18, "raw_value_size": 999314, "raw_average_value_size": 2266, "num_data_blocks": 65, "num_entries": 441, "num_filter_entries": 441, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026642, "oldest_key_time": 1770026642, "file_creation_time": 1770026687, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 10403 microseconds, and 3354 cpu microseconds.
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.313495) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1010544 bytes OK
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.313523) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.315774) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.315793) EVENT_LOG_v1 {"time_micros": 1770026687315788, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.315813) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 1524319, prev total WAL file size 1524319, number of live WAL files 2.
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.316379) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323530' seq:72057594037927935, type:22 .. '6C6F676D00353032' seq:0, type:0; will stop at (end)
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(986KB)], [45(11MB)]
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687316431, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13280521, "oldest_snapshot_seqno": -1}
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5415 keys, 13121633 bytes, temperature: kUnknown
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687409292, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13121633, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13085284, "index_size": 21711, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 138447, "raw_average_key_size": 25, "raw_value_size": 12987133, "raw_average_value_size": 2398, "num_data_blocks": 886, "num_entries": 5415, "num_filter_entries": 5415, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026687, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.409646) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13121633 bytes
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.417845) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.8 rd, 141.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 11.7 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(26.1) write-amplify(13.0) OK, records in: 5944, records dropped: 529 output_compression: NoCompression
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.417876) EVENT_LOG_v1 {"time_micros": 1770026687417863, "job": 26, "event": "compaction_finished", "compaction_time_micros": 92986, "compaction_time_cpu_micros": 18182, "output_level": 6, "num_output_files": 1, "total_output_size": 13121633, "num_input_records": 5944, "num_output_records": 5415, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687418179, "job": 26, "event": "table_file_deletion", "file_number": 47}
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687420041, "job": 26, "event": "table_file_deletion", "file_number": 45}
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.316262) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.420077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.420082) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.420086) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.420089) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:47 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.420092) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:04:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4001e20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:48.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:48 np0005604791 nova_compute[226294]: 2026-02-02 10:04:48.890 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:48.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8000f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4001e20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:49 np0005604791 nova_compute[226294]: 2026-02-02 10:04:49.909 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:50.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:50.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4001e20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:52.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:52.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:53 np0005604791 nova_compute[226294]: 2026-02-02 10:04:53.891 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:54.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:54 np0005604791 nova_compute[226294]: 2026-02-02 10:04:54.910 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:04:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:54.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:04:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb  2 05:04:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3753626506' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb  2 05:04:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb  2 05:04:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3753626506' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb  2 05:04:55 np0005604791 podman[230996]: 2026-02-02 10:04:55.443307178 +0000 UTC m=+0.107711515 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb  2 05:04:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:56.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:56.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:04:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4002f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:58.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:58 np0005604791 nova_compute[226294]: 2026-02-02 10:04:58.892 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:04:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:04:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:58.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:04:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:04:59 np0005604791 nova_compute[226294]: 2026-02-02 10:04:59.911 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:04:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:00.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:00.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:02 np0005604791 podman[231052]: 2026-02-02 10:05:02.374665678 +0000 UTC m=+0.053547255 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb  2 05:05:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:02.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:02.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:03 np0005604791 nova_compute[226294]: 2026-02-02 10:05:03.893 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:04.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:04 np0005604791 nova_compute[226294]: 2026-02-02 10:05:04.913 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:04.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:05:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:06.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:05:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:05:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:06.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:05:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:07 np0005604791 nova_compute[226294]: 2026-02-02 10:05:07.946 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:05:07 np0005604791 nova_compute[226294]: 2026-02-02 10:05:07.947 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:05:07 np0005604791 nova_compute[226294]: 2026-02-02 10:05:07.947 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:05:07 np0005604791 nova_compute[226294]: 2026-02-02 10:05:07.948 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:05:07 np0005604791 nova_compute[226294]: 2026-02-02 10:05:07.948 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:05:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.646 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.664 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.665 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.665 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.697 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.698 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.698 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.699 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.699 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:05:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:08.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:08 np0005604791 nova_compute[226294]: 2026-02-02 10:05:08.895 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:08.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:09 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:05:09 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1611746226' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.156 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.315 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.316 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4979MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.316 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.316 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.400 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.400 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.427 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:05:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:09 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:05:09 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1662336788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.892 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.897 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.911 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.912 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.912 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:05:09 np0005604791 nova_compute[226294]: 2026-02-02 10:05:09.922 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:05:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:05:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:05:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:05:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:05:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:10.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:05:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:10.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc001ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:11 np0005604791 nova_compute[226294]: 2026-02-02 10:05:11.896 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:05:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:12.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:12.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:13 np0005604791 nova_compute[226294]: 2026-02-02 10:05:13.897 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:14.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:14 np0005604791 nova_compute[226294]: 2026-02-02 10:05:14.946 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:14.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:15 np0005604791 nova_compute[226294]: 2026-02-02 10:05:15.702 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:15 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:05:15.702 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:05:15 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:05:15.705 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb  2 05:05:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:16 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:05:16 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:05:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:16.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:16.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:18.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:18 np0005604791 nova_compute[226294]: 2026-02-02 10:05:18.899 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:18.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:19 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:05:19.707 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:05:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:19 np0005604791 nova_compute[226294]: 2026-02-02 10:05:19.948 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:05:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:20.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:05:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:20.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:22.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:22.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:23 np0005604791 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb  2 05:05:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:23 np0005604791 nova_compute[226294]: 2026-02-02 10:05:23.899 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:24.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:24 np0005604791 nova_compute[226294]: 2026-02-02 10:05:24.986 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:05:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:24.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:05:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:26 np0005604791 podman[231262]: 2026-02-02 10:05:26.480579042 +0000 UTC m=+0.150230242 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Feb  2 05:05:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:26.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:26.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:28.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:28 np0005604791 nova_compute[226294]: 2026-02-02 10:05:28.931 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:05:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:28.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:05:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:30 np0005604791 nova_compute[226294]: 2026-02-02 10:05:30.027 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:30.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:30.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:32.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:32.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:33 np0005604791 podman[231290]: 2026-02-02 10:05:33.40307638 +0000 UTC m=+0.079217256 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:05:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:33 np0005604791 nova_compute[226294]: 2026-02-02 10:05:33.977 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:34.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:35.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:35 np0005604791 nova_compute[226294]: 2026-02-02 10:05:35.060 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:05:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:36.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:05:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:37.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:38.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100539 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 05:05:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:39 np0005604791 nova_compute[226294]: 2026-02-02 10:05:39.030 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:39.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:40 np0005604791 nova_compute[226294]: 2026-02-02 10:05:40.063 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:40 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:40.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:41.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:42.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:43.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:44 np0005604791 nova_compute[226294]: 2026-02-02 10:05:44.063 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:05:44.905 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:05:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:05:44.906 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:05:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:05:44.906 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:05:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:44.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:45.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:45 np0005604791 nova_compute[226294]: 2026-02-02 10:05:45.065 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:46.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:05:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:47.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:05:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 05:05:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:48.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:49.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:49 np0005604791 nova_compute[226294]: 2026-02-02 10:05:49.068 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:50 np0005604791 nova_compute[226294]: 2026-02-02 10:05:50.067 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:50.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:51.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 05:05:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 05:05:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:52.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:05:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:53.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:05:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:54 np0005604791 nova_compute[226294]: 2026-02-02 10:05:54.079 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 05:05:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:54.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:55.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:55 np0005604791 nova_compute[226294]: 2026-02-02 10:05:55.070 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:56.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:57.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:05:57 np0005604791 podman[231349]: 2026-02-02 10:05:57.457443534 +0000 UTC m=+0.125536246 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb  2 05:05:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:05:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:58.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:05:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:05:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:05:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:59.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:05:59 np0005604791 nova_compute[226294]: 2026-02-02 10:05:59.112 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:05:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:05:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:00 np0005604791 nova_compute[226294]: 2026-02-02 10:06:00.071 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:00.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100601 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 05:06:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:01.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.716554) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762716595, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 984, "num_deletes": 251, "total_data_size": 2136220, "memory_usage": 2174248, "flush_reason": "Manual Compaction"}
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762734395, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1411299, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25088, "largest_seqno": 26067, "table_properties": {"data_size": 1406868, "index_size": 2083, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10028, "raw_average_key_size": 19, "raw_value_size": 1397892, "raw_average_value_size": 2751, "num_data_blocks": 93, "num_entries": 508, "num_filter_entries": 508, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026687, "oldest_key_time": 1770026687, "file_creation_time": 1770026762, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 17929 microseconds, and 4816 cpu microseconds.
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.734478) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1411299 bytes OK
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.734502) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.736925) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.736952) EVENT_LOG_v1 {"time_micros": 1770026762736944, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.736974) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2131277, prev total WAL file size 2131277, number of live WAL files 2.
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.737599) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1378KB)], [48(12MB)]
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762737640, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 14532932, "oldest_snapshot_seqno": -1}
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5407 keys, 12364573 bytes, temperature: kUnknown
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762827031, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 12364573, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12328990, "index_size": 20945, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 138963, "raw_average_key_size": 25, "raw_value_size": 12231647, "raw_average_value_size": 2262, "num_data_blocks": 851, "num_entries": 5407, "num_filter_entries": 5407, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026762, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.827361) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 12364573 bytes
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.831443) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.4 rd, 138.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 12.5 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(19.1) write-amplify(8.8) OK, records in: 5923, records dropped: 516 output_compression: NoCompression
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.831471) EVENT_LOG_v1 {"time_micros": 1770026762831458, "job": 28, "event": "compaction_finished", "compaction_time_micros": 89471, "compaction_time_cpu_micros": 32110, "output_level": 6, "num_output_files": 1, "total_output_size": 12364573, "num_input_records": 5923, "num_output_records": 5407, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762831844, "job": 28, "event": "table_file_deletion", "file_number": 50}
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762833744, "job": 28, "event": "table_file_deletion", "file_number": 48}
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.737534) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.833864) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.833872) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.833875) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.833878) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:06:02 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.833881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:06:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:02.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:03.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:04 np0005604791 nova_compute[226294]: 2026-02-02 10:06:04.116 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:04 np0005604791 podman[231403]: 2026-02-02 10:06:04.399404109 +0000 UTC m=+0.067840044 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:06:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:06:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:04.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:06:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:05.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:05 np0005604791 nova_compute[226294]: 2026-02-02 10:06:05.074 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:06.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:07.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:07 np0005604791 nova_compute[226294]: 2026-02-02 10:06:07.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:06:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:08 np0005604791 nova_compute[226294]: 2026-02-02 10:06:08.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:06:08 np0005604791 nova_compute[226294]: 2026-02-02 10:06:08.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:06:08 np0005604791 nova_compute[226294]: 2026-02-02 10:06:08.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:06:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:08.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:09.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:09 np0005604791 nova_compute[226294]: 2026-02-02 10:06:09.122 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:09 np0005604791 nova_compute[226294]: 2026-02-02 10:06:09.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:06:09 np0005604791 nova_compute[226294]: 2026-02-02 10:06:09.665 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:06:09 np0005604791 nova_compute[226294]: 2026-02-02 10:06:09.666 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:06:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.083 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.721 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.722 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.783 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.784 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.784 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.785 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:06:10 np0005604791 nova_compute[226294]: 2026-02-02 10:06:10.785 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:06:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:06:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:10.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:06:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:11.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:06:11 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/771773473' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:06:11 np0005604791 nova_compute[226294]: 2026-02-02 10:06:11.302 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:06:11 np0005604791 nova_compute[226294]: 2026-02-02 10:06:11.484 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:06:11 np0005604791 nova_compute[226294]: 2026-02-02 10:06:11.486 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4972MB free_disk=59.92196273803711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:06:11 np0005604791 nova_compute[226294]: 2026-02-02 10:06:11.487 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:06:11 np0005604791 nova_compute[226294]: 2026-02-02 10:06:11.487 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:06:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:12 np0005604791 nova_compute[226294]: 2026-02-02 10:06:12.113 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:06:12 np0005604791 nova_compute[226294]: 2026-02-02 10:06:12.114 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:06:12 np0005604791 nova_compute[226294]: 2026-02-02 10:06:12.134 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:06:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:06:12 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3165935581' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:06:12 np0005604791 nova_compute[226294]: 2026-02-02 10:06:12.573 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:06:12 np0005604791 nova_compute[226294]: 2026-02-02 10:06:12.578 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:06:12 np0005604791 nova_compute[226294]: 2026-02-02 10:06:12.612 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:06:12 np0005604791 nova_compute[226294]: 2026-02-02 10:06:12.613 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:06:12 np0005604791 nova_compute[226294]: 2026-02-02 10:06:12.613 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:06:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:06:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:12.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:06:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:13.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:13 np0005604791 nova_compute[226294]: 2026-02-02 10:06:13.541 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:06:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:14 np0005604791 nova_compute[226294]: 2026-02-02 10:06:14.124 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:06:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:14.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:06:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:06:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:15.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:06:15 np0005604791 nova_compute[226294]: 2026-02-02 10:06:15.085 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:16.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:17.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:18.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:19.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:19 np0005604791 nova_compute[226294]: 2026-02-02 10:06:19.127 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:06:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:06:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:06:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:06:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:06:20 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:06:20 np0005604791 nova_compute[226294]: 2026-02-02 10:06:20.086 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:20.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:21.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:22.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:23.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:24 np0005604791 nova_compute[226294]: 2026-02-02 10:06:24.130 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:06:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:24.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:06:25 np0005604791 nova_compute[226294]: 2026-02-02 10:06:25.088 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:25.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:06:25 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:06:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:26.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb  2 05:06:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:27.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb  2 05:06:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:28 np0005604791 podman[231612]: 2026-02-02 10:06:28.434072 +0000 UTC m=+0.095354294 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb  2 05:06:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:28.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:06:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:29.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:06:29 np0005604791 nova_compute[226294]: 2026-02-02 10:06:29.132 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:30 np0005604791 nova_compute[226294]: 2026-02-02 10:06:30.090 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:30.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:31.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:06:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:33.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:06:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:33.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:34 np0005604791 nova_compute[226294]: 2026-02-02 10:06:34.135 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:35.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:35.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:35 np0005604791 nova_compute[226294]: 2026-02-02 10:06:35.119 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:35 np0005604791 podman[231642]: 2026-02-02 10:06:35.397088433 +0000 UTC m=+0.076801252 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Feb  2 05:06:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:37.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:37.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:38 np0005604791 nova_compute[226294]: 2026-02-02 10:06:38.060 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:06:38.060 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:06:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:06:38.062 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb  2 05:06:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:39.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:39.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:39 np0005604791 nova_compute[226294]: 2026-02-02 10:06:39.137 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:40 np0005604791 nova_compute[226294]: 2026-02-02 10:06:40.121 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:40 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:41.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:41.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:42 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:06:42.064 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:06:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:06:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:43.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:06:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:43.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:44 np0005604791 nova_compute[226294]: 2026-02-02 10:06:44.140 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:06:44.907 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:06:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:06:44.907 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:06:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:06:44.907 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:06:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:45.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:45.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:45 np0005604791 nova_compute[226294]: 2026-02-02 10:06:45.173 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:47.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:47.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:49.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:49.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:49 np0005604791 nova_compute[226294]: 2026-02-02 10:06:49.143 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:50 np0005604791 nova_compute[226294]: 2026-02-02 10:06:50.228 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:51.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:51.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:53.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:53.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:54 np0005604791 nova_compute[226294]: 2026-02-02 10:06:54.146 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:06:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:55.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:06:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:55.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:55 np0005604791 nova_compute[226294]: 2026-02-02 10:06:55.254 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:57.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:57.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:06:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:59.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:06:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:06:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:59.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:06:59 np0005604791 nova_compute[226294]: 2026-02-02 10:06:59.149 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:06:59 np0005604791 podman[231703]: 2026-02-02 10:06:59.410669524 +0000 UTC m=+0.075175519 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:06:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:06:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:00 np0005604791 nova_compute[226294]: 2026-02-02 10:07:00.283 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:01.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.102 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.102 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.125 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Feb  2 05:07:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:01.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.190 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.190 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.196 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.196 226298 INFO nova.compute.claims [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Claim successful on node compute-1.ctlplane.example.com#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.289 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:07:01 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:07:01 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/198600574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.760 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.765 226298 DEBUG nova.compute.provider_tree [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.787 226298 DEBUG nova.scheduler.client.report [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.811 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.812 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.869 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.870 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Feb  2 05:07:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.890 226298 INFO nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Feb  2 05:07:01 np0005604791 nova_compute[226294]: 2026-02-02 10:07:01.909 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Feb  2 05:07:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.007 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.008 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.008 226298 INFO nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Creating image(s)#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.035 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.067 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.094 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.098 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.156 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.157 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.159 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.159 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.198 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.202 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:07:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.572 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.369s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.663 226298 DEBUG nova.policy [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b1695a2a70d4aa0aa350ba17d8f6d5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.674 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] resizing rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.827 226298 DEBUG nova.objects.instance [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'migration_context' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.843 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.843 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Ensure instance console log exists: /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.844 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.844 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:02 np0005604791 nova_compute[226294]: 2026-02-02 10:07:02.844 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:03.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:03.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:04 np0005604791 nova_compute[226294]: 2026-02-02 10:07:04.152 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:04 np0005604791 nova_compute[226294]: 2026-02-02 10:07:04.672 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Successfully created port: 09a00258-4f60-42dd-a769-b2ea3b870187 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Feb  2 05:07:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:05.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:05.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:05 np0005604791 nova_compute[226294]: 2026-02-02 10:07:05.329 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:05 np0005604791 nova_compute[226294]: 2026-02-02 10:07:05.422 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Successfully updated port: 09a00258-4f60-42dd-a769-b2ea3b870187 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Feb  2 05:07:05 np0005604791 nova_compute[226294]: 2026-02-02 10:07:05.435 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:07:05 np0005604791 nova_compute[226294]: 2026-02-02 10:07:05.435 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:07:05 np0005604791 nova_compute[226294]: 2026-02-02 10:07:05.435 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Feb  2 05:07:05 np0005604791 nova_compute[226294]: 2026-02-02 10:07:05.540 226298 DEBUG nova.compute.manager [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:07:05 np0005604791 nova_compute[226294]: 2026-02-02 10:07:05.541 226298 DEBUG nova.compute.manager [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing instance network info cache due to event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Feb  2 05:07:05 np0005604791 nova_compute[226294]: 2026-02-02 10:07:05.541 226298 DEBUG oslo_concurrency.lockutils [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:07:05 np0005604791 nova_compute[226294]: 2026-02-02 10:07:05.590 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Feb  2 05:07:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:06 np0005604791 podman[231946]: 2026-02-02 10:07:06.367725129 +0000 UTC m=+0.048363576 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:07:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.676 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.709 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.709 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Instance network_info: |[{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.710 226298 DEBUG oslo_concurrency.lockutils [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.710 226298 DEBUG nova.network.neutron [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.713 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Start _get_guest_xml network_info=[{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-02T10:01:42Z,direct_url=<?>,disk_format='qcow2',id=d5e062d7-95ef-409c-9ad0-60f7cf6f44ce,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='823d3e7e313a44e9a50531e3fef22a1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-02T10:01:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'guest_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': 'd5e062d7-95ef-409c-9ad0-60f7cf6f44ce'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.718 226298 WARNING nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.722 226298 DEBUG nova.virt.libvirt.host [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.723 226298 DEBUG nova.virt.libvirt.host [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.726 226298 DEBUG nova.virt.libvirt.host [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.726 226298 DEBUG nova.virt.libvirt.host [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.727 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.727 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-02T10:01:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1194feb9-e285-414e-825a-1e77171d092f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-02T10:01:42Z,direct_url=<?>,disk_format='qcow2',id=d5e062d7-95ef-409c-9ad0-60f7cf6f44ce,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='823d3e7e313a44e9a50531e3fef22a1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-02T10:01:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.728 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.728 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.728 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.728 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.729 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.729 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.729 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.729 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.730 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.730 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Feb  2 05:07:06 np0005604791 nova_compute[226294]: 2026-02-02 10:07:06.733 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:07:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:07.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb  2 05:07:07 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1442920525' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb  2 05:07:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:07.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.142 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.167 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.170 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:07:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb  2 05:07:07 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/671941752' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.576 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.578 226298 DEBUG nova.virt.libvirt.vif [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-02T10:07:01Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.578 226298 DEBUG nova.network.os_vif_util [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.579 226298 DEBUG nova.network.os_vif_util [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.580 226298 DEBUG nova.objects.instance [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'pci_devices' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.599 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] End _get_guest_xml xml=<domain type="kvm">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <uuid>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</uuid>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <name>instance-00000006</name>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <memory>131072</memory>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <vcpu>1</vcpu>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <metadata>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <nova:creationTime>2026-02-02 10:07:06</nova:creationTime>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <nova:flavor name="m1.nano">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <nova:memory>128</nova:memory>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <nova:disk>1</nova:disk>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <nova:swap>0</nova:swap>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <nova:ephemeral>0</nova:ephemeral>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <nova:vcpus>1</nova:vcpus>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      </nova:flavor>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <nova:owner>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      </nova:owner>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <nova:ports>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        </nova:port>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      </nova:ports>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    </nova:instance>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  </metadata>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <sysinfo type="smbios">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <system>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <entry name="manufacturer">RDO</entry>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <entry name="product">OpenStack Compute</entry>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <entry name="serial">15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <entry name="uuid">15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <entry name="family">Virtual Machine</entry>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    </system>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  </sysinfo>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <os>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <type arch="x86_64" machine="q35">hvm</type>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <boot dev="hd"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <smbios mode="sysinfo"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  </os>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <features>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <acpi/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <apic/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <vmcoreinfo/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  </features>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <clock offset="utc">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <timer name="pit" tickpolicy="delay"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <timer name="rtc" tickpolicy="catchup"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <timer name="hpet" present="no"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  </clock>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <cpu mode="host-model" match="exact">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <topology sockets="1" cores="1" threads="1"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  </cpu>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  <devices>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <disk type="network" device="disk">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <driver type="raw" cache="none"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <source protocol="rbd" name="vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <host name="192.168.122.100" port="6789"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <host name="192.168.122.102" port="6789"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <host name="192.168.122.101" port="6789"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <auth username="openstack">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <secret type="ceph" uuid="d241d473-9fcb-5f74-b163-f1ca4454e7f1"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <target dev="vda" bus="virtio"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <disk type="network" device="cdrom">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <driver type="raw" cache="none"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <source protocol="rbd" name="vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <host name="192.168.122.100" port="6789"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <host name="192.168.122.102" port="6789"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <host name="192.168.122.101" port="6789"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <auth username="openstack">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:        <secret type="ceph" uuid="d241d473-9fcb-5f74-b163-f1ca4454e7f1"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <target dev="sda" bus="sata"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <interface type="ethernet">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <mac address="fa:16:3e:85:9a:96"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <model type="virtio"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <driver name="vhost" rx_queue_size="512"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <mtu size="1442"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <target dev="tap09a00258-4f"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    </interface>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <serial type="pty">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <log file="/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log" append="off"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    </serial>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <video>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <model type="virtio"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    </video>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <input type="tablet" bus="usb"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <rng model="virtio">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <backend model="random">/dev/urandom</backend>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    </rng>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <controller type="usb" index="0"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    <memballoon model="virtio">
Feb  2 05:07:07 np0005604791 nova_compute[226294]:      <stats period="10"/>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:    </memballoon>
Feb  2 05:07:07 np0005604791 nova_compute[226294]:  </devices>
Feb  2 05:07:07 np0005604791 nova_compute[226294]: </domain>
Feb  2 05:07:07 np0005604791 nova_compute[226294]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.600 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Preparing to wait for external event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.600 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.600 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.600 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.601 226298 DEBUG nova.virt.libvirt.vif [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-02T10:07:01Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.601 226298 DEBUG nova.network.os_vif_util [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.602 226298 DEBUG nova.network.os_vif_util [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.602 226298 DEBUG os_vif [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.603 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.603 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.604 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.608 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.608 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09a00258-4f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.608 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09a00258-4f, col_values=(('external_ids', {'iface-id': '09a00258-4f60-42dd-a769-b2ea3b870187', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:9a:96', 'vm-uuid': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.610 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:07 np0005604791 NetworkManager[49055]: <info>  [1770026827.6118] manager: (tap09a00258-4f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.612 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.616 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.616 226298 INFO os_vif [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f')#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.673 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.673 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.673 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No VIF found with MAC fa:16:3e:85:9a:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.674 226298 INFO nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Using config drive#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.700 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.804 226298 DEBUG nova.network.neutron [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated VIF entry in instance network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.805 226298 DEBUG nova.network.neutron [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:07:07 np0005604791 nova_compute[226294]: 2026-02-02 10:07:07.820 226298 DEBUG oslo_concurrency.lockutils [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:07:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.011 226298 INFO nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Creating config drive at /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.014 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8eow0059 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.130 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8eow0059" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.159 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.163 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.338 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.339 226298 INFO nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Deleting local config drive /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config because it was imported into RBD.#033[00m
Feb  2 05:07:08 np0005604791 systemd[1]: Starting libvirt secret daemon...
Feb  2 05:07:08 np0005604791 systemd[1]: Started libvirt secret daemon.
Feb  2 05:07:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:08 np0005604791 kernel: tap09a00258-4f: entered promiscuous mode
Feb  2 05:07:08 np0005604791 NetworkManager[49055]: <info>  [1770026828.4072] manager: (tap09a00258-4f): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Feb  2 05:07:08 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:08Z|00037|binding|INFO|Claiming lport 09a00258-4f60-42dd-a769-b2ea3b870187 for this chassis.
Feb  2 05:07:08 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:08Z|00038|binding|INFO|09a00258-4f60-42dd-a769-b2ea3b870187: Claiming fa:16:3e:85:9a:96 10.100.0.10
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.408 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.413 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.415 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.423 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:9a:96 10.100.0.10'], port_security=['fa:16:3e:85:9a:96 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '09104532-215f-4de3-9920-7fd818e6c676', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=755f8a60-018a-461f-bb4b-b9017895ccf7, chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=09a00258-4f60-42dd-a769-b2ea3b870187) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.425 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 09a00258-4f60-42dd-a769-b2ea3b870187 in datapath ba6c4c87-77a9-4fcc-aa14-a4637c78f692 bound to our chassis#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.426 143542 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ba6c4c87-77a9-4fcc-aa14-a4637c78f692#033[00m
Feb  2 05:07:08 np0005604791 systemd-machined[195072]: New machine qemu-2-instance-00000006.
Feb  2 05:07:08 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:08Z|00039|binding|INFO|Setting lport 09a00258-4f60-42dd-a769-b2ea3b870187 ovn-installed in OVS
Feb  2 05:07:08 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:08Z|00040|binding|INFO|Setting lport 09a00258-4f60-42dd-a769-b2ea3b870187 up in Southbound
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.434 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.436 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[65e199ed-06de-4d41-a2eb-33014dbc1bb9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.436 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapba6c4c87-71 in ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.438 229827 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapba6c4c87-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.438 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[4506bab7-892e-42e2-b5db-173aa418d42d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.438 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[0465fb71-833b-4c77-84fb-afbfdb434584]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 systemd[1]: Started Virtual Machine qemu-2-instance-00000006.
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.448 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[9d47535b-dd51-4276-9cba-183180b95bbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 systemd-udevd[232123]: Network interface NamePolicy= disabled on kernel command line.
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.455 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[10a1883d-f0e4-413e-a01c-b4cff4d6f13d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 NetworkManager[49055]: <info>  [1770026828.4647] device (tap09a00258-4f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb  2 05:07:08 np0005604791 NetworkManager[49055]: <info>  [1770026828.4654] device (tap09a00258-4f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.473 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[59b31869-9266-4c89-88b6-3c8f50d232a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 NetworkManager[49055]: <info>  [1770026828.4777] manager: (tapba6c4c87-70): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.476 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c1a2c3-9091-491d-ae28-c07a97f6646e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.496 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[0a08a144-e314-48a0-a5cb-54fb507e7430]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.498 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[639aed7a-33ca-4d5a-9a1b-a84df73d8a52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 NetworkManager[49055]: <info>  [1770026828.5112] device (tapba6c4c87-70): carrier: link connected
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.513 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[80f3ffbc-ce86-4b5e-b3e9-9ae90c6f2f49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.525 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[03600d44-1fc6-4f3c-9619-ccd403c15139]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba6c4c87-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:56:7d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399082, 'reachable_time': 19289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232154, 'error': None, 'target': 'ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.538 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd17a88-e59a-49b8-bad1-c19ba141fe29]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:567d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399082, 'tstamp': 399082}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232155, 'error': None, 'target': 'ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.549 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[2e4af456-b786-4f47-aaa1-4405db578468]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba6c4c87-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:56:7d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399082, 'reachable_time': 19289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232156, 'error': None, 'target': 'ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.571 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[6236fccb-2caa-41a0-8f1d-f1603319bc5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.609 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[61c0c742-7b6f-42a7-a608-138bdfb509f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.610 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba6c4c87-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.610 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.611 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapba6c4c87-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.639 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:08 np0005604791 NetworkManager[49055]: <info>  [1770026828.6397] manager: (tapba6c4c87-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Feb  2 05:07:08 np0005604791 kernel: tapba6c4c87-70: entered promiscuous mode
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.642 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.643 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapba6c4c87-70, col_values=(('external_ids', {'iface-id': 'f5df8d3e-4c61-4492-9e28-98679c02afcc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.644 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:08 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:08Z|00041|binding|INFO|Releasing lport f5df8d3e-4c61-4492-9e28-98679c02afcc from this chassis (sb_readonly=0)
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.647 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.650 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.651 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.651 143542 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ba6c4c87-77a9-4fcc-aa14-a4637c78f692.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ba6c4c87-77a9-4fcc-aa14-a4637c78f692.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.652 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[f1810634-da25-43aa-9de4-2ce0f394addd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.653 143542 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: global
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    log         /dev/log local0 debug
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    log-tag     haproxy-metadata-proxy-ba6c4c87-77a9-4fcc-aa14-a4637c78f692
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    user        root
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    group       root
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    maxconn     1024
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    pidfile     /var/lib/neutron/external/pids/ba6c4c87-77a9-4fcc-aa14-a4637c78f692.pid.haproxy
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    daemon
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: defaults
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    log global
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    mode http
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    option httplog
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    option dontlognull
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    option http-server-close
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    option forwardfor
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    retries                 3
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    timeout http-request    30s
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    timeout connect         30s
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    timeout client          32s
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    timeout server          32s
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    timeout http-keep-alive 30s
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: listen listener
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    bind 169.254.169.254:80
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    server metadata /var/lib/neutron/metadata_proxy
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]:    http-request add-header X-OVN-Network-ID ba6c4c87-77a9-4fcc-aa14-a4637c78f692
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Feb  2 05:07:08 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.653 143542 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'env', 'PROCESS_TAG=haproxy-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ba6c4c87-77a9-4fcc-aa14-a4637c78f692.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.764 226298 DEBUG nova.compute.manager [req-61f82e88-35cc-4a48-940c-0e5050a59af8 req-c64ba354-c440-46ca-8a20-f96eee8d7cd0 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.765 226298 DEBUG oslo_concurrency.lockutils [req-61f82e88-35cc-4a48-940c-0e5050a59af8 req-c64ba354-c440-46ca-8a20-f96eee8d7cd0 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.765 226298 DEBUG oslo_concurrency.lockutils [req-61f82e88-35cc-4a48-940c-0e5050a59af8 req-c64ba354-c440-46ca-8a20-f96eee8d7cd0 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.765 226298 DEBUG oslo_concurrency.lockutils [req-61f82e88-35cc-4a48-940c-0e5050a59af8 req-c64ba354-c440-46ca-8a20-f96eee8d7cd0 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:08 np0005604791 nova_compute[226294]: 2026-02-02 10:07:08.766 226298 DEBUG nova.compute.manager [req-61f82e88-35cc-4a48-940c-0e5050a59af8 req-c64ba354-c440-46ca-8a20-f96eee8d7cd0 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Processing event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Feb  2 05:07:08 np0005604791 podman[232188]: 2026-02-02 10:07:08.95946637 +0000 UTC m=+0.053961975 container create 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:07:08 np0005604791 systemd[1]: Started libpod-conmon-2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81.scope.
Feb  2 05:07:09 np0005604791 systemd[1]: Started libcrun container.
Feb  2 05:07:09 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7e71b523a6cf72f6079510db5422c0e2666a6b8442a4c07506d8ee1c5789881/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb  2 05:07:09 np0005604791 podman[232188]: 2026-02-02 10:07:08.93687982 +0000 UTC m=+0.031375445 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb  2 05:07:09 np0005604791 podman[232188]: 2026-02-02 10:07:09.034332129 +0000 UTC m=+0.128827784 container init 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb  2 05:07:09 np0005604791 podman[232188]: 2026-02-02 10:07:09.03887185 +0000 UTC m=+0.133367445 container start 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 05:07:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:07:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:09.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:07:09 np0005604791 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [NOTICE]   (232207) : New worker (232217) forked
Feb  2 05:07:09 np0005604791 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [NOTICE]   (232207) : Loading success.
Feb  2 05:07:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:09.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.197 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770026829.197551, 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.198 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] VM Started (Lifecycle Event)#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.199 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.202 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.204 226298 INFO nova.virt.libvirt.driver [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Instance spawned successfully.#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.205 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.226 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.230 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.231 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.231 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.232 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.232 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.233 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.237 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.265 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.266 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770026829.1976917, 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.266 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] VM Paused (Lifecycle Event)#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.289 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.292 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770026829.2014902, 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.292 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] VM Resumed (Lifecycle Event)#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.300 226298 INFO nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Took 7.29 seconds to spawn the instance on the hypervisor.#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.300 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.312 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.315 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.341 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.357 226298 INFO nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Took 8.19 seconds to build instance.#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.379 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:07:09 np0005604791 nova_compute[226294]: 2026-02-02 10:07:09.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:07:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.331 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.792 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.793 226298 DEBUG nova.objects.instance [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.884 226298 DEBUG nova.compute.manager [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.884 226298 DEBUG oslo_concurrency.lockutils [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.885 226298 DEBUG oslo_concurrency.lockutils [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.885 226298 DEBUG oslo_concurrency.lockutils [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.885 226298 DEBUG nova.compute.manager [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:07:10 np0005604791 nova_compute[226294]: 2026-02-02 10:07:10.886 226298 WARNING nova.compute.manager [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 for instance with vm_state active and task_state None.#033[00m
Feb  2 05:07:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:07:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:11.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:07:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:11.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490004140 fd 49 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:12 np0005604791 nova_compute[226294]: 2026-02-02 10:07:12.610 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:13.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:13.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:13 np0005604791 nova_compute[226294]: 2026-02-02 10:07:13.775 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:13 np0005604791 NetworkManager[49055]: <info>  [1770026833.7765] manager: (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Feb  2 05:07:13 np0005604791 NetworkManager[49055]: <info>  [1770026833.7776] manager: (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Feb  2 05:07:13 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:13Z|00042|binding|INFO|Releasing lport f5df8d3e-4c61-4492-9e28-98679c02afcc from this chassis (sb_readonly=0)
Feb  2 05:07:13 np0005604791 nova_compute[226294]: 2026-02-02 10:07:13.788 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:13 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:13Z|00043|binding|INFO|Releasing lport f5df8d3e-4c61-4492-9e28-98679c02afcc from this chassis (sb_readonly=0)
Feb  2 05:07:13 np0005604791 nova_compute[226294]: 2026-02-02 10:07:13.792 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:13 np0005604791 nova_compute[226294]: 2026-02-02 10:07:13.873 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:07:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:13 np0005604791 nova_compute[226294]: 2026-02-02 10:07:13.896 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:07:13 np0005604791 nova_compute[226294]: 2026-02-02 10:07:13.897 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb  2 05:07:13 np0005604791 nova_compute[226294]: 2026-02-02 10:07:13.897 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:07:13 np0005604791 nova_compute[226294]: 2026-02-02 10:07:13.898 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:07:13 np0005604791 nova_compute[226294]: 2026-02-02 10:07:13.898 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:07:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:14 np0005604791 nova_compute[226294]: 2026-02-02 10:07:14.038 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:14 np0005604791 nova_compute[226294]: 2026-02-02 10:07:14.038 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:14 np0005604791 nova_compute[226294]: 2026-02-02 10:07:14.038 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:14 np0005604791 nova_compute[226294]: 2026-02-02 10:07:14.039 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:07:14 np0005604791 nova_compute[226294]: 2026-02-02 10:07:14.039 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:07:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:14 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:07:14 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/386387081' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:07:14 np0005604791 nova_compute[226294]: 2026-02-02 10:07:14.569 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:07:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:07:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:15.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:07:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:07:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:15.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.149 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.150 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.286 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.288 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4787MB free_disk=59.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.288 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.288 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.329 226298 DEBUG nova.compute.manager [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.330 226298 DEBUG nova.compute.manager [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing instance network info cache due to event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.330 226298 DEBUG oslo_concurrency.lockutils [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.330 226298 DEBUG oslo_concurrency.lockutils [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.331 226298 DEBUG nova.network.neutron [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.369 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.410 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.411 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.411 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.455 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:07:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:07:15 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1423075345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:07:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.909 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.918 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.941 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.973 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:07:15 np0005604791 nova_compute[226294]: 2026-02-02 10:07:15.974 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:17.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:17.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:17 np0005604791 nova_compute[226294]: 2026-02-02 10:07:17.612 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:18 np0005604791 nova_compute[226294]: 2026-02-02 10:07:18.177 226298 DEBUG nova.network.neutron [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated VIF entry in instance network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Feb  2 05:07:18 np0005604791 nova_compute[226294]: 2026-02-02 10:07:18.178 226298 DEBUG nova.network.neutron [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:07:18 np0005604791 nova_compute[226294]: 2026-02-02 10:07:18.213 226298 DEBUG oslo_concurrency.lockutils [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:07:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:19.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:07:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:19.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:07:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:20 np0005604791 nova_compute[226294]: 2026-02-02 10:07:20.371 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40045b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:21.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:21.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:22 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:22Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:85:9a:96 10.100.0.10
Feb  2 05:07:22 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:22Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:85:9a:96 10.100.0.10
Feb  2 05:07:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:22 np0005604791 nova_compute[226294]: 2026-02-02 10:07:22.650 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:23.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:23.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40045b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:25.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:07:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:25.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:07:25 np0005604791 nova_compute[226294]: 2026-02-02 10:07:25.372 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40045d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:07:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:07:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:07:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:07:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:27.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:07:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:27.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:07:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:27 np0005604791 nova_compute[226294]: 2026-02-02 10:07:27.653 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:27 np0005604791 nova_compute[226294]: 2026-02-02 10:07:27.753 226298 INFO nova.compute.manager [None req-e8a9e4c2-dcd9-4a84-8af8-41e4bd0aa7ec 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Get console output#033[00m
Feb  2 05:07:27 np0005604791 nova_compute[226294]: 2026-02-02 10:07:27.759 226298 INFO oslo.privsep.daemon [None req-e8a9e4c2-dcd9-4a84-8af8-41e4bd0aa7ec 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp6yb7dc4f/privsep.sock']#033[00m
Feb  2 05:07:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:28 np0005604791 nova_compute[226294]: 2026-02-02 10:07:28.429 226298 INFO oslo.privsep.daemon [None req-e8a9e4c2-dcd9-4a84-8af8-41e4bd0aa7ec 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Spawned new privsep daemon via rootwrap#033[00m
Feb  2 05:07:28 np0005604791 nova_compute[226294]: 2026-02-02 10:07:28.299 232427 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Feb  2 05:07:28 np0005604791 nova_compute[226294]: 2026-02-02 10:07:28.302 232427 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Feb  2 05:07:28 np0005604791 nova_compute[226294]: 2026-02-02 10:07:28.304 232427 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Feb  2 05:07:28 np0005604791 nova_compute[226294]: 2026-02-02 10:07:28.304 232427 INFO oslo.privsep.daemon [-] privsep daemon running as pid 232427#033[00m
Feb  2 05:07:28 np0005604791 nova_compute[226294]: 2026-02-02 10:07:28.531 232427 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Feb  2 05:07:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:29.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:29.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:30 np0005604791 nova_compute[226294]: 2026-02-02 10:07:30.375 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:30 np0005604791 podman[232430]: 2026-02-02 10:07:30.43933429 +0000 UTC m=+0.106256475 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb  2 05:07:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:31.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:31.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004630 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:07:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:07:32 np0005604791 nova_compute[226294]: 2026-02-02 10:07:32.110 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:32 np0005604791 nova_compute[226294]: 2026-02-02 10:07:32.111 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:32 np0005604791 nova_compute[226294]: 2026-02-02 10:07:32.112 226298 DEBUG nova.objects.instance [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'flavor' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:07:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:32 np0005604791 nova_compute[226294]: 2026-02-02 10:07:32.655 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:33.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:07:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:33.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:07:33 np0005604791 nova_compute[226294]: 2026-02-02 10:07:33.367 226298 DEBUG nova.objects.instance [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'pci_requests' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:07:33 np0005604791 nova_compute[226294]: 2026-02-02 10:07:33.385 226298 DEBUG nova.network.neutron [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Feb  2 05:07:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004650 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:34 np0005604791 nova_compute[226294]: 2026-02-02 10:07:34.157 226298 DEBUG nova.policy [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b1695a2a70d4aa0aa350ba17d8f6d5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Feb  2 05:07:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:35.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:35.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:35 np0005604791 nova_compute[226294]: 2026-02-02 10:07:35.283 226298 DEBUG nova.network.neutron [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Successfully created port: c66e0be1-d166-4088-8ad8-baa84f3d032d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Feb  2 05:07:35 np0005604791 nova_compute[226294]: 2026-02-02 10:07:35.415 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:36 np0005604791 nova_compute[226294]: 2026-02-02 10:07:36.516 226298 DEBUG nova.network.neutron [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Successfully updated port: c66e0be1-d166-4088-8ad8-baa84f3d032d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Feb  2 05:07:36 np0005604791 nova_compute[226294]: 2026-02-02 10:07:36.639 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:07:36 np0005604791 nova_compute[226294]: 2026-02-02 10:07:36.640 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:07:36 np0005604791 nova_compute[226294]: 2026-02-02 10:07:36.640 226298 DEBUG nova.network.neutron [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Feb  2 05:07:36 np0005604791 nova_compute[226294]: 2026-02-02 10:07:36.724 226298 DEBUG nova.compute.manager [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-changed-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:07:36 np0005604791 nova_compute[226294]: 2026-02-02 10:07:36.725 226298 DEBUG nova.compute.manager [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing instance network info cache due to event network-changed-c66e0be1-d166-4088-8ad8-baa84f3d032d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Feb  2 05:07:36 np0005604791 nova_compute[226294]: 2026-02-02 10:07:36.725 226298 DEBUG oslo_concurrency.lockutils [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:07:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:07:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:37.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:07:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:37.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:37 np0005604791 podman[232486]: 2026-02-02 10:07:37.414124867 +0000 UTC m=+0.090033523 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Feb  2 05:07:37 np0005604791 nova_compute[226294]: 2026-02-02 10:07:37.703 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.740 226298 DEBUG nova.network.neutron [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.786 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.787 226298 DEBUG oslo_concurrency.lockutils [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.787 226298 DEBUG nova.network.neutron [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing network info cache for port c66e0be1-d166-4088-8ad8-baa84f3d032d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.789 226298 DEBUG nova.virt.libvirt.vif [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.790 226298 DEBUG nova.network.os_vif_util [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.790 226298 DEBUG nova.network.os_vif_util [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.791 226298 DEBUG os_vif [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.791 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.792 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.792 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.795 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.795 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc66e0be1-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.796 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc66e0be1-d1, col_values=(('external_ids', {'iface-id': 'c66e0be1-d166-4088-8ad8-baa84f3d032d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:49:24', 'vm-uuid': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.798 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:38 np0005604791 NetworkManager[49055]: <info>  [1770026858.8004] manager: (tapc66e0be1-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.801 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.806 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.806 226298 INFO os_vif [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1')#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.807 226298 DEBUG nova.virt.libvirt.vif [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.808 226298 DEBUG nova.network.os_vif_util [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.808 226298 DEBUG nova.network.os_vif_util [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.811 226298 DEBUG nova.virt.libvirt.guest [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] attach device xml: <interface type="ethernet">
Feb  2 05:07:38 np0005604791 nova_compute[226294]:  <mac address="fa:16:3e:2f:49:24"/>
Feb  2 05:07:38 np0005604791 nova_compute[226294]:  <model type="virtio"/>
Feb  2 05:07:38 np0005604791 nova_compute[226294]:  <driver name="vhost" rx_queue_size="512"/>
Feb  2 05:07:38 np0005604791 nova_compute[226294]:  <mtu size="1442"/>
Feb  2 05:07:38 np0005604791 nova_compute[226294]:  <target dev="tapc66e0be1-d1"/>
Feb  2 05:07:38 np0005604791 nova_compute[226294]: </interface>
Feb  2 05:07:38 np0005604791 nova_compute[226294]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Feb  2 05:07:38 np0005604791 kernel: tapc66e0be1-d1: entered promiscuous mode
Feb  2 05:07:38 np0005604791 NetworkManager[49055]: <info>  [1770026858.8216] manager: (tapc66e0be1-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Feb  2 05:07:38 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:38Z|00044|binding|INFO|Claiming lport c66e0be1-d166-4088-8ad8-baa84f3d032d for this chassis.
Feb  2 05:07:38 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:38Z|00045|binding|INFO|c66e0be1-d166-4088-8ad8-baa84f3d032d: Claiming fa:16:3e:2f:49:24 10.100.0.18
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.824 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.851 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:38 np0005604791 systemd-udevd[232512]: Network interface NamePolicy= disabled on kernel command line.
Feb  2 05:07:38 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:38Z|00046|binding|INFO|Setting lport c66e0be1-d166-4088-8ad8-baa84f3d032d ovn-installed in OVS
Feb  2 05:07:38 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:38Z|00047|binding|INFO|Setting lport c66e0be1-d166-4088-8ad8-baa84f3d032d up in Southbound
Feb  2 05:07:38 np0005604791 nova_compute[226294]: 2026-02-02 10:07:38.854 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.856 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:49:24 10.100.0.18'], port_security=['fa:16:3e:2f:49:24 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e125f54e-7556-49c5-8356-e7390df43c53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '22473684-a0d2-4e4f-b1c5-3e6fdbc49578', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9d42b65-630e-4d58-b649-2acc01d097b4, chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=c66e0be1-d166-4088-8ad8-baa84f3d032d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.860 143542 INFO neutron.agent.ovn.metadata.agent [-] Port c66e0be1-d166-4088-8ad8-baa84f3d032d in datapath e125f54e-7556-49c5-8356-e7390df43c53 bound to our chassis#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.863 143542 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e125f54e-7556-49c5-8356-e7390df43c53#033[00m
Feb  2 05:07:38 np0005604791 NetworkManager[49055]: <info>  [1770026858.8688] device (tapc66e0be1-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb  2 05:07:38 np0005604791 NetworkManager[49055]: <info>  [1770026858.8698] device (tapc66e0be1-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.874 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[6b57f2f3-eaad-4696-b0f2-0b4ca9b61460]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.877 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape125f54e-71 in ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.879 229827 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape125f54e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.879 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[16e039c8-837d-46dd-aac0-d66c92ae2c06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.881 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[30a942ce-dec5-4e94-8c7c-e29610e9d54f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.893 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd13150-8a38-4993-ab91-3f9cd30d6253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.908 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[888ba296-52c3-426d-8199-8cd9ffa4f6c8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.932 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[7bcae479-f76a-4623-b06f-ccdae8f3db50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:38 np0005604791 NetworkManager[49055]: <info>  [1770026858.9375] manager: (tape125f54e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.938 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[2d92357e-39e5-4ef0-ad04-54654d0c1e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.962 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[eb99aa16-e1b7-4faf-a28f-9e0efe9b3ae8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.966 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[2aba1a51-6951-4948-8304-0ae2e266883c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:38 np0005604791 NetworkManager[49055]: <info>  [1770026858.9881] device (tape125f54e-70): carrier: link connected
Feb  2 05:07:38 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.992 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7e538b-0da0-4b1e-8d92-559254d390e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.010 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[6c16237c-7bf4-4937-8151-58b6d3f36c57]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape125f54e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:b7:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402129, 'reachable_time': 43246, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232539, 'error': None, 'target': 'ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.017 226298 DEBUG nova.virt.libvirt.driver [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.018 226298 DEBUG nova.virt.libvirt.driver [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.018 226298 DEBUG nova.virt.libvirt.driver [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No VIF found with MAC fa:16:3e:85:9a:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.019 226298 DEBUG nova.virt.libvirt.driver [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No VIF found with MAC fa:16:3e:2f:49:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.025 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e85fc9-cbab-411b-b7e1-5b0ced62b883]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:b741'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402129, 'tstamp': 402129}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232540, 'error': None, 'target': 'ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.039 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[702f1213-38d5-4d51-b43f-21a976ccb059]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape125f54e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:b7:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402129, 'reachable_time': 43246, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232541, 'error': None, 'target': 'ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.062 226298 DEBUG nova.virt.libvirt.guest [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb  2 05:07:39 np0005604791 nova_compute[226294]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:  <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:  <nova:creationTime>2026-02-02 10:07:39</nova:creationTime>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:  <nova:flavor name="m1.nano">
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    <nova:memory>128</nova:memory>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    <nova:disk>1</nova:disk>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    <nova:swap>0</nova:swap>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    <nova:ephemeral>0</nova:ephemeral>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    <nova:vcpus>1</nova:vcpus>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:  </nova:flavor>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:  <nova:owner>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:  </nova:owner>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:  <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:  <nova:ports>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb  2 05:07:39 np0005604791 nova_compute[226294]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    </nova:port>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    <nova:port uuid="c66e0be1-d166-4088-8ad8-baa84f3d032d">
Feb  2 05:07:39 np0005604791 nova_compute[226294]:      <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:    </nova:port>
Feb  2 05:07:39 np0005604791 nova_compute[226294]:  </nova:ports>
Feb  2 05:07:39 np0005604791 nova_compute[226294]: </nova:instance>
Feb  2 05:07:39 np0005604791 nova_compute[226294]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.063 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[8913437d-f33b-4530-9121-b3e1207c95d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.097 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.986s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:39.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.111 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[16792929-3308-48a0-a430-30da1b7efdbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.112 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape125f54e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.113 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.114 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape125f54e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.116 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:39 np0005604791 kernel: tape125f54e-70: entered promiscuous mode
Feb  2 05:07:39 np0005604791 NetworkManager[49055]: <info>  [1770026859.1181] manager: (tape125f54e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.120 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape125f54e-70, col_values=(('external_ids', {'iface-id': '4948ba2f-4901-4550-ab74-f4adf1b82ea1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.121 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:39 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:39Z|00048|binding|INFO|Releasing lport 4948ba2f-4901-4550-ab74-f4adf1b82ea1 from this chassis (sb_readonly=0)
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.128 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.129 143542 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e125f54e-7556-49c5-8356-e7390df43c53.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e125f54e-7556-49c5-8356-e7390df43c53.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.130 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[c363a1dc-879f-470e-af0f-a6f47d9c6e74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.131 143542 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: global
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    log         /dev/log local0 debug
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    log-tag     haproxy-metadata-proxy-e125f54e-7556-49c5-8356-e7390df43c53
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    user        root
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    group       root
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    maxconn     1024
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    pidfile     /var/lib/neutron/external/pids/e125f54e-7556-49c5-8356-e7390df43c53.pid.haproxy
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    daemon
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: defaults
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    log global
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    mode http
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    option httplog
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    option dontlognull
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    option http-server-close
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    option forwardfor
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    retries                 3
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    timeout http-request    30s
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    timeout connect         30s
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    timeout client          32s
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    timeout server          32s
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    timeout http-keep-alive 30s
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: listen listener
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    bind 169.254.169.254:80
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    server metadata /var/lib/neutron/metadata_proxy
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]:    http-request add-header X-OVN-Network-ID e125f54e-7556-49c5-8356-e7390df43c53
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.131 143542 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53', 'env', 'PROCESS_TAG=haproxy-e125f54e-7556-49c5-8356-e7390df43c53', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e125f54e-7556-49c5-8356-e7390df43c53.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Feb  2 05:07:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:39.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.353 226298 DEBUG nova.compute.manager [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.354 226298 DEBUG oslo_concurrency.lockutils [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.354 226298 DEBUG oslo_concurrency.lockutils [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.354 226298 DEBUG oslo_concurrency.lockutils [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.355 226298 DEBUG nova.compute.manager [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.355 226298 WARNING nova.compute.manager [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d for instance with vm_state active and task_state None.#033[00m
Feb  2 05:07:39 np0005604791 podman[232573]: 2026-02-02 10:07:39.461656959 +0000 UTC m=+0.031982230 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb  2 05:07:39 np0005604791 podman[232573]: 2026-02-02 10:07:39.575644308 +0000 UTC m=+0.145969539 container create e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb  2 05:07:39 np0005604791 systemd[1]: Started libpod-conmon-e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e.scope.
Feb  2 05:07:39 np0005604791 systemd[1]: Started libcrun container.
Feb  2 05:07:39 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7028958bbb9ac49e703bb1728fefda69b8f73736997e2045bf747f59bb53233/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb  2 05:07:39 np0005604791 podman[232573]: 2026-02-02 10:07:39.683222806 +0000 UTC m=+0.253548047 container init e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:07:39 np0005604791 podman[232573]: 2026-02-02 10:07:39.687578652 +0000 UTC m=+0.257903873 container start e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:07:39 np0005604791 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [NOTICE]   (232617) : New worker (232619) forked
Feb  2 05:07:39 np0005604791 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [NOTICE]   (232617) : Loading success.
Feb  2 05:07:39 np0005604791 nova_compute[226294]: 2026-02-02 10:07:39.744 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.745 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:07:39 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.746 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb  2 05:07:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:40 np0005604791 nova_compute[226294]: 2026-02-02 10:07:40.225 226298 DEBUG nova.network.neutron [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated VIF entry in instance network info cache for port c66e0be1-d166-4088-8ad8-baa84f3d032d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Feb  2 05:07:40 np0005604791 nova_compute[226294]: 2026-02-02 10:07:40.225 226298 DEBUG nova.network.neutron [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:07:40 np0005604791 nova_compute[226294]: 2026-02-02 10:07:40.239 226298 DEBUG oslo_concurrency.lockutils [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:07:40 np0005604791 nova_compute[226294]: 2026-02-02 10:07:40.418 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:40 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:07:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:41.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:07:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:41.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:41 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:41Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2f:49:24 10.100.0.18
Feb  2 05:07:41 np0005604791 ovn_controller[133666]: 2026-02-02T10:07:41Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2f:49:24 10.100.0.18
Feb  2 05:07:41 np0005604791 nova_compute[226294]: 2026-02-02 10:07:41.455 226298 DEBUG nova.compute.manager [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:07:41 np0005604791 nova_compute[226294]: 2026-02-02 10:07:41.455 226298 DEBUG oslo_concurrency.lockutils [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:41 np0005604791 nova_compute[226294]: 2026-02-02 10:07:41.456 226298 DEBUG oslo_concurrency.lockutils [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:41 np0005604791 nova_compute[226294]: 2026-02-02 10:07:41.456 226298 DEBUG oslo_concurrency.lockutils [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:41 np0005604791 nova_compute[226294]: 2026-02-02 10:07:41.456 226298 DEBUG nova.compute.manager [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:07:41 np0005604791 nova_compute[226294]: 2026-02-02 10:07:41.457 226298 WARNING nova.compute.manager [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d for instance with vm_state active and task_state None.#033[00m
Feb  2 05:07:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:43.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:43.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:43 np0005604791 nova_compute[226294]: 2026-02-02 10:07:43.840 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100744 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 05:07:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:44.908 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:07:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:44.908 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:07:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:44.909 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:07:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:45.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100745 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 05:07:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:45.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:45 np0005604791 nova_compute[226294]: 2026-02-02 10:07:45.421 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:47.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:07:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:47.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:07:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:48 np0005604791 nova_compute[226294]: 2026-02-02 10:07:48.843 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:07:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:49.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:07:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:49.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:49 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:07:49.749 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:07:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:50 np0005604791 nova_compute[226294]: 2026-02-02 10:07:50.462 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:51.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:51.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb  2 05:07:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:07:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:53.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:07:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:53.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:53 np0005604791 nova_compute[226294]: 2026-02-02 10:07:53.877 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:55.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:55.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:55 np0005604791 nova_compute[226294]: 2026-02-02 10:07:55.466 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb  2 05:07:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 05:07:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 05:07:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:57.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:07:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:57.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:07:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:07:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb  2 05:07:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:07:58 np0005604791 nova_compute[226294]: 2026-02-02 10:07:58.903 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:07:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:07:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:59.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:07:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:07:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:07:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:59.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:07:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:00 np0005604791 nova_compute[226294]: 2026-02-02 10:08:00.468 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb  2 05:08:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb  2 05:08:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:01.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb  2 05:08:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:01.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:01 np0005604791 podman[232666]: 2026-02-02 10:08:01.496227236 +0000 UTC m=+0.156197001 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 05:08:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:08:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:03.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:08:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:08:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:03.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:08:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:03 np0005604791 nova_compute[226294]: 2026-02-02 10:08:03.936 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100804 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 05:08:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:05.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:05.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:05 np0005604791 nova_compute[226294]: 2026-02-02 10:08:05.470 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:08:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:07.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:08:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100807 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb  2 05:08:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:07.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:08 np0005604791 podman[232695]: 2026-02-02 10:08:08.389306911 +0000 UTC m=+0.069894048 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 05:08:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:08 np0005604791 nova_compute[226294]: 2026-02-02 10:08:08.941 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:08:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:09.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:08:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:09.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:09 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:08:09 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/961067775' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:08:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:10 np0005604791 nova_compute[226294]: 2026-02-02 10:08:10.473 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:10 np0005604791 nova_compute[226294]: 2026-02-02 10:08:10.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:10 np0005604791 nova_compute[226294]: 2026-02-02 10:08:10.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:08:10 np0005604791 nova_compute[226294]: 2026-02-02 10:08:10.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:08:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:11.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:11 np0005604791 nova_compute[226294]: 2026-02-02 10:08:11.162 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:08:11 np0005604791 nova_compute[226294]: 2026-02-02 10:08:11.163 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:08:11 np0005604791 nova_compute[226294]: 2026-02-02 10:08:11.163 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb  2 05:08:11 np0005604791 nova_compute[226294]: 2026-02-02 10:08:11.163 226298 DEBUG nova.objects.instance [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:08:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:11.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:13.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:13.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:13 np0005604791 nova_compute[226294]: 2026-02-02 10:08:13.988 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:08:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:15.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.187 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:08:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:15.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.274 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.275 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.276 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.276 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.277 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.277 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.278 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.278 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.279 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.279 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.348 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.349 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.349 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.350 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.350 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.475 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:08:15 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2018632270' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.804 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.864 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb  2 05:08:15 np0005604791 nova_compute[226294]: 2026-02-02 10:08:15.865 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb  2 05:08:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.070 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.071 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4694MB free_disk=59.8979606628418GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.072 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.072 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.212 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.213 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.213 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.371 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:08:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4001a30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:16 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:08:16 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2361210182' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.808 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.849 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.869 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.870 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.871 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.871 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.871 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.885 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.885 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.885 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Feb  2 05:08:16 np0005604791 nova_compute[226294]: 2026-02-02 10:08:16.898 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:17.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:08:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:17.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:08:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:17 np0005604791 nova_compute[226294]: 2026-02-02 10:08:17.904 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:17 np0005604791 nova_compute[226294]: 2026-02-02 10:08:17.905 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:17 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:18 np0005604791 nova_compute[226294]: 2026-02-02 10:08:18.992 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:19.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:19.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:19 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4001a30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:20 np0005604791 nova_compute[226294]: 2026-02-02 10:08:20.494 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:08:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:21.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:08:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:21.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:21 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:23.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:23.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:23 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:24 np0005604791 nova_compute[226294]: 2026-02-02 10:08:24.025 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:24 np0005604791 ovn_controller[133666]: 2026-02-02T10:08:24Z|00049|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Feb  2 05:08:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:08:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:25.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:08:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:08:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:25.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:08:25 np0005604791 nova_compute[226294]: 2026-02-02 10:08:25.382 226298 DEBUG nova.compute.manager [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-changed-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:08:25 np0005604791 nova_compute[226294]: 2026-02-02 10:08:25.383 226298 DEBUG nova.compute.manager [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing instance network info cache due to event network-changed-c66e0be1-d166-4088-8ad8-baa84f3d032d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Feb  2 05:08:25 np0005604791 nova_compute[226294]: 2026-02-02 10:08:25.383 226298 DEBUG oslo_concurrency.lockutils [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:08:25 np0005604791 nova_compute[226294]: 2026-02-02 10:08:25.383 226298 DEBUG oslo_concurrency.lockutils [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:08:25 np0005604791 nova_compute[226294]: 2026-02-02 10:08:25.384 226298 DEBUG nova.network.neutron [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing network info cache for port c66e0be1-d166-4088-8ad8-baa84f3d032d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Feb  2 05:08:25 np0005604791 nova_compute[226294]: 2026-02-02 10:08:25.496 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:25 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001f10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:27.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:27.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:27 np0005604791 nova_compute[226294]: 2026-02-02 10:08:27.408 226298 DEBUG nova.network.neutron [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated VIF entry in instance network info cache for port c66e0be1-d166-4088-8ad8-baa84f3d032d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Feb  2 05:08:27 np0005604791 nova_compute[226294]: 2026-02-02 10:08:27.408 226298 DEBUG nova.network.neutron [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:08:27 np0005604791 nova_compute[226294]: 2026-02-02 10:08:27.421 226298 DEBUG oslo_concurrency.lockutils [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:08:27 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:29 np0005604791 nova_compute[226294]: 2026-02-02 10:08:29.058 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:29.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:29.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:29 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:30 np0005604791 nova_compute[226294]: 2026-02-02 10:08:30.537 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:31.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:31.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:31 np0005604791 podman[232826]: 2026-02-02 10:08:31.642891158 +0000 UTC m=+0.090379023 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:08:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:08:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:08:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:08:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:08:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:08:33 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:08:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:33.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:33.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:33 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:34 np0005604791 nova_compute[226294]: 2026-02-02 10:08:34.066 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:35.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:35.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:35 np0005604791 nova_compute[226294]: 2026-02-02 10:08:35.591 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:35 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:08:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:37.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:08:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:37.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:37 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:38 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:08:38 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:08:39 np0005604791 nova_compute[226294]: 2026-02-02 10:08:39.068 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:08:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:39.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:08:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:39.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:39 np0005604791 podman[232938]: 2026-02-02 10:08:39.392508601 +0000 UTC m=+0.062843151 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent)
Feb  2 05:08:39 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:40 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:40 np0005604791 nova_compute[226294]: 2026-02-02 10:08:40.405 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:08:40 np0005604791 nova_compute[226294]: 2026-02-02 10:08:40.434 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Triggering sync for uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Feb  2 05:08:40 np0005604791 nova_compute[226294]: 2026-02-02 10:08:40.435 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:08:40 np0005604791 nova_compute[226294]: 2026-02-02 10:08:40.435 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:08:40 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:40 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:40 np0005604791 nova_compute[226294]: 2026-02-02 10:08:40.496 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:08:40 np0005604791 nova_compute[226294]: 2026-02-02 10:08:40.614 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:08:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:41.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:08:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:41.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:41 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:42 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:43.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:08:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:43.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:08:43 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:44 np0005604791 nova_compute[226294]: 2026-02-02 10:08:44.072 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:44.908 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:08:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:44.908 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:08:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:44.909 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:08:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:08:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:45.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:08:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:45.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:45 np0005604791 nova_compute[226294]: 2026-02-02 10:08:45.654 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:45 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:46 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:08:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:47.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:08:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:47.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:47 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b150 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:48 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:49 np0005604791 nova_compute[226294]: 2026-02-02 10:08:49.075 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:08:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:49.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:08:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:49.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:49 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b170 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:50 np0005604791 nova_compute[226294]: 2026-02-02 10:08:50.708 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:51.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:51.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:51 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:52 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:53.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:53.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:53 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:54 np0005604791 nova_compute[226294]: 2026-02-02 10:08:54.079 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:54 np0005604791 nova_compute[226294]: 2026-02-02 10:08:54.228 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:54 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:54.229 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:08:54 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:54.230 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb  2 05:08:54 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb  2 05:08:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2188296208' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb  2 05:08:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb  2 05:08:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2188296208' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb  2 05:08:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:55.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:55.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:55 np0005604791 nova_compute[226294]: 2026-02-02 10:08:55.711 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:55 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:56 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:08:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:57.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:08:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:57.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.489 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-c66e0be1-d166-4088-8ad8-baa84f3d032d" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.489 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-c66e0be1-d166-4088-8ad8-baa84f3d032d" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.508 226298 DEBUG nova.objects.instance [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'flavor' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.535 226298 DEBUG nova.virt.libvirt.vif [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.535 226298 DEBUG nova.network.os_vif_util [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.536 226298 DEBUG nova.network.os_vif_util [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.540 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.542 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.545 226298 DEBUG nova.virt.libvirt.driver [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Attempting to detach device tapc66e0be1-d1 from instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.546 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] detach device xml: <interface type="ethernet">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <mac address="fa:16:3e:2f:49:24"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <model type="virtio"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <driver name="vhost" rx_queue_size="512"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <mtu size="1442"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <target dev="tapc66e0be1-d1"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: </interface>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.552 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.555 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface>not found in domain: <domain type='kvm' id='2'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <name>instance-00000006</name>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <uuid>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</uuid>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <metadata>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:creationTime>2026-02-02 10:07:39</nova:creationTime>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:flavor name="m1.nano">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:memory>128</nova:memory>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:disk>1</nova:disk>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:swap>0</nova:swap>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:ephemeral>0</nova:ephemeral>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:vcpus>1</nova:vcpus>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </nova:flavor>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:owner>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </nova:owner>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:ports>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </nova:port>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:port uuid="c66e0be1-d166-4088-8ad8-baa84f3d032d">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </nova:port>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </nova:ports>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: </nova:instance>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </metadata>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <memory unit='KiB'>131072</memory>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <currentMemory unit='KiB'>131072</currentMemory>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <vcpu placement='static'>1</vcpu>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <resource>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <partition>/machine</partition>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </resource>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <sysinfo type='smbios'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <system>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='manufacturer'>RDO</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='product'>OpenStack Compute</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='serial'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='uuid'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='family'>Virtual Machine</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </system>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </sysinfo>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <os>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <boot dev='hd'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <smbios mode='sysinfo'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </os>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <features>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <acpi/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <apic/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <vmcoreinfo state='on'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </features>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <cpu mode='custom' match='exact' check='full'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <model fallback='forbid'>EPYC-Rome</model>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <vendor>AMD</vendor>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='x2apic'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='tsc-deadline'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='hypervisor'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='tsc_adjust'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='spec-ctrl'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='stibp'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='ssbd'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='cmp_legacy'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='overflow-recov'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='succor'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='ibrs'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='amd-ssbd'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='virt-ssbd'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='lbrv'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='tsc-scale'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='vmcb-clean'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='flushbyasid'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='pause-filter'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='pfthreshold'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='svme-addr-chk'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='lfence-always-serializing'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='xsaves'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='svm'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='topoext'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='npt'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='nrip-save'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </cpu>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <clock offset='utc'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <timer name='pit' tickpolicy='delay'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <timer name='rtc' tickpolicy='catchup'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <timer name='hpet' present='no'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </clock>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <on_poweroff>destroy</on_poweroff>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <on_reboot>restart</on_reboot>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <on_crash>destroy</on_crash>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <devices>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <disk type='network' device='disk'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <driver name='qemu' type='raw' cache='none'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <auth username='openstack'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk' index='2'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.100' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.102' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.101' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target dev='vda' bus='virtio'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='virtio-disk0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <disk type='network' device='cdrom'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <driver name='qemu' type='raw' cache='none'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <auth username='openstack'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config' index='1'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.100' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.102' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.101' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target dev='sda' bus='sata'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <readonly/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='sata0-0-0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='0' model='pcie-root'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pcie.0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='1' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='1' port='0x10'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='2' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='2' port='0x11'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='3' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='3' port='0x12'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.3'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='4' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='4' port='0x13'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.4'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='5' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='5' port='0x14'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.5'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='6' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='6' port='0x15'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.6'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='7' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='7' port='0x16'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.7'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='8' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='8' port='0x17'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.8'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='9' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='9' port='0x18'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.9'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='10' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='10' port='0x19'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.10'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='11' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='11' port='0x1a'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.11'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='12' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='12' port='0x1b'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.12'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='13' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='13' port='0x1c'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.13'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='14' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='14' port='0x1d'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.14'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='15' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='15' port='0x1e'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.15'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='16' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='16' port='0x1f'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.16'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='17' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='17' port='0x20'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.17'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='18' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='18' port='0x21'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.18'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='19' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='19' port='0x22'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.19'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='20' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='20' port='0x23'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.20'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='21' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='21' port='0x24'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.21'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='22' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='22' port='0x25'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.22'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='23' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='23' port='0x26'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.23'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='24' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='24' port='0x27'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.24'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='25' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='25' port='0x28'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.25'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-pci-bridge'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.26'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='usb' index='0' model='piix3-uhci'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='usb'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='sata' index='0'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='ide'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <interface type='ethernet'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <mac address='fa:16:3e:85:9a:96'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target dev='tap09a00258-4f'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model type='virtio'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <driver name='vhost' rx_queue_size='512'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <mtu size='1442'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='net0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </interface>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <interface type='ethernet'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <mac address='fa:16:3e:2f:49:24'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target dev='tapc66e0be1-d1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model type='virtio'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <driver name='vhost' rx_queue_size='512'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <mtu size='1442'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='net1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </interface>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <serial type='pty'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <source path='/dev/pts/0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target type='isa-serial' port='0'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <model name='isa-serial'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      </target>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='serial0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </serial>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <console type='pty' tty='/dev/pts/0'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <source path='/dev/pts/0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target type='serial' port='0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='serial0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </console>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <input type='tablet' bus='usb'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='input0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='usb' bus='0' port='1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <input type='mouse' bus='ps2'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='input1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <input type='keyboard' bus='ps2'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='input2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <listen type='address' address='::0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </graphics>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <audio id='1' type='none'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <video>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model type='virtio' heads='1' primary='yes'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='video0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </video>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <watchdog model='itco' action='reset'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='watchdog0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </watchdog>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <memballoon model='virtio'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <stats period='10'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='balloon0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </memballoon>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <rng model='virtio'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <backend model='random'>/dev/urandom</backend>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='rng0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </rng>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </devices>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <label>system_u:system_r:svirt_t:s0:c659,c775</label>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c659,c775</imagelabel>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </seclabel>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <label>+107:+107</label>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <imagelabel>+107:+107</imagelabel>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </seclabel>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: </domain>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.556 226298 INFO nova.virt.libvirt.driver [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully detached device tapc66e0be1-d1 from instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 from the persistent domain config.#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.557 226298 DEBUG nova.virt.libvirt.driver [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] (1/8): Attempting to detach device tapc66e0be1-d1 with device alias net1 from instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.558 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] detach device xml: <interface type="ethernet">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <mac address="fa:16:3e:2f:49:24"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <model type="virtio"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <driver name="vhost" rx_queue_size="512"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <mtu size="1442"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <target dev="tapc66e0be1-d1"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: </interface>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Feb  2 05:08:57 np0005604791 kernel: tapc66e0be1-d1 (unregistering): left promiscuous mode
Feb  2 05:08:57 np0005604791 NetworkManager[49055]: <info>  [1770026937.6762] device (tapc66e0be1-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.682 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:57 np0005604791 ovn_controller[133666]: 2026-02-02T10:08:57Z|00050|binding|INFO|Releasing lport c66e0be1-d166-4088-8ad8-baa84f3d032d from this chassis (sb_readonly=0)
Feb  2 05:08:57 np0005604791 ovn_controller[133666]: 2026-02-02T10:08:57Z|00051|binding|INFO|Setting lport c66e0be1-d166-4088-8ad8-baa84f3d032d down in Southbound
Feb  2 05:08:57 np0005604791 ovn_controller[133666]: 2026-02-02T10:08:57Z|00052|binding|INFO|Removing iface tapc66e0be1-d1 ovn-installed in OVS
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.686 226298 DEBUG nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Received event <DeviceRemovedEvent: 1770026937.6865523, 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Feb  2 05:08:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.690 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:49:24 10.100.0.18', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e125f54e-7556-49c5-8356-e7390df43c53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9d42b65-630e-4d58-b649-2acc01d097b4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=c66e0be1-d166-4088-8ad8-baa84f3d032d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.690 226298 DEBUG nova.virt.libvirt.driver [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Start waiting for the detach event from libvirt for device tapc66e0be1-d1 with device alias net1 for instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.691 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Feb  2 05:08:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.691 143542 INFO neutron.agent.ovn.metadata.agent [-] Port c66e0be1-d166-4088-8ad8-baa84f3d032d in datapath e125f54e-7556-49c5-8356-e7390df43c53 unbound from our chassis#033[00m
Feb  2 05:08:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.692 143542 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e125f54e-7556-49c5-8356-e7390df43c53, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb  2 05:08:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.693 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[c93a1ce9-0db9-459e-8786-29195437a201]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:08:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.694 143542 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53 namespace which is not needed anymore#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.695 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.697 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface>not found in domain: <domain type='kvm' id='2'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <name>instance-00000006</name>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <uuid>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</uuid>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <metadata>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:creationTime>2026-02-02 10:07:39</nova:creationTime>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:flavor name="m1.nano">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:memory>128</nova:memory>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:disk>1</nova:disk>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:swap>0</nova:swap>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:ephemeral>0</nova:ephemeral>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:vcpus>1</nova:vcpus>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </nova:flavor>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:owner>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </nova:owner>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:ports>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </nova:port>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:port uuid="c66e0be1-d166-4088-8ad8-baa84f3d032d">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </nova:port>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </nova:ports>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: </nova:instance>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </metadata>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <memory unit='KiB'>131072</memory>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <currentMemory unit='KiB'>131072</currentMemory>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <vcpu placement='static'>1</vcpu>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <resource>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <partition>/machine</partition>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </resource>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <sysinfo type='smbios'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <system>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='manufacturer'>RDO</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='product'>OpenStack Compute</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='serial'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='uuid'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <entry name='family'>Virtual Machine</entry>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </system>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </sysinfo>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <os>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <boot dev='hd'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <smbios mode='sysinfo'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </os>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <features>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <acpi/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <apic/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <vmcoreinfo state='on'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </features>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <cpu mode='custom' match='exact' check='full'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <model fallback='forbid'>EPYC-Rome</model>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <vendor>AMD</vendor>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='x2apic'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='tsc-deadline'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='hypervisor'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='tsc_adjust'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='spec-ctrl'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='stibp'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='ssbd'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='cmp_legacy'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='overflow-recov'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='succor'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='ibrs'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='amd-ssbd'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='virt-ssbd'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='lbrv'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='tsc-scale'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='vmcb-clean'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='flushbyasid'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='pause-filter'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='pfthreshold'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='svme-addr-chk'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='lfence-always-serializing'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='xsaves'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='svm'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='require' name='topoext'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='npt'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <feature policy='disable' name='nrip-save'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </cpu>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <clock offset='utc'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <timer name='pit' tickpolicy='delay'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <timer name='rtc' tickpolicy='catchup'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <timer name='hpet' present='no'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </clock>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <on_poweroff>destroy</on_poweroff>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <on_reboot>restart</on_reboot>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <on_crash>destroy</on_crash>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <devices>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <disk type='network' device='disk'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <driver name='qemu' type='raw' cache='none'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <auth username='openstack'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk' index='2'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.100' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.102' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.101' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target dev='vda' bus='virtio'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='virtio-disk0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <disk type='network' device='cdrom'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <driver name='qemu' type='raw' cache='none'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <auth username='openstack'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config' index='1'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.100' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.102' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <host name='192.168.122.101' port='6789'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target dev='sda' bus='sata'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <readonly/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='sata0-0-0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='0' model='pcie-root'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pcie.0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='1' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='1' port='0x10'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='2' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='2' port='0x11'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='3' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='3' port='0x12'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.3'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='4' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='4' port='0x13'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.4'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='5' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='5' port='0x14'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.5'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='6' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='6' port='0x15'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.6'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='7' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='7' port='0x16'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.7'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='8' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='8' port='0x17'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.8'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='9' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='9' port='0x18'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.9'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='10' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='10' port='0x19'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.10'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='11' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='11' port='0x1a'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.11'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='12' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='12' port='0x1b'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.12'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='13' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='13' port='0x1c'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.13'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='14' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='14' port='0x1d'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.14'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='15' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='15' port='0x1e'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.15'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='16' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='16' port='0x1f'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.16'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='17' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='17' port='0x20'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.17'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='18' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='18' port='0x21'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.18'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='19' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='19' port='0x22'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.19'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='20' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='20' port='0x23'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.20'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='21' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='21' port='0x24'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.21'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='22' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='22' port='0x25'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.22'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='23' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='23' port='0x26'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.23'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='24' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='24' port='0x27'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.24'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='25' model='pcie-root-port'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target chassis='25' port='0x28'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.25'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model name='pcie-pci-bridge'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='pci.26'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='usb' index='0' model='piix3-uhci'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='usb'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <controller type='sata' index='0'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='ide'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <interface type='ethernet'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <mac address='fa:16:3e:85:9a:96'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target dev='tap09a00258-4f'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model type='virtio'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <driver name='vhost' rx_queue_size='512'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <mtu size='1442'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='net0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </interface>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <serial type='pty'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <source path='/dev/pts/0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target type='isa-serial' port='0'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:        <model name='isa-serial'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      </target>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='serial0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </serial>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <console type='pty' tty='/dev/pts/0'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <source path='/dev/pts/0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <target type='serial' port='0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='serial0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </console>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <input type='tablet' bus='usb'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='input0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='usb' bus='0' port='1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <input type='mouse' bus='ps2'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='input1'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <input type='keyboard' bus='ps2'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='input2'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <listen type='address' address='::0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </graphics>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <audio id='1' type='none'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <video>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <model type='virtio' heads='1' primary='yes'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='video0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </video>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <watchdog model='itco' action='reset'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='watchdog0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </watchdog>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <memballoon model='virtio'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <stats period='10'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='balloon0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </memballoon>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <rng model='virtio'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <backend model='random'>/dev/urandom</backend>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <alias name='rng0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </rng>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </devices>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <label>system_u:system_r:svirt_t:s0:c659,c775</label>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c659,c775</imagelabel>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </seclabel>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <label>+107:+107</label>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <imagelabel>+107:+107</imagelabel>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </seclabel>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: </domain>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.697 226298 INFO nova.virt.libvirt.driver [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully detached device tapc66e0be1-d1 from instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 from the live domain config.
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.699 226298 DEBUG nova.virt.libvirt.vif [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.700 226298 DEBUG nova.network.os_vif_util [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.701 226298 DEBUG nova.network.os_vif_util [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.702 226298 DEBUG os_vif [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.705 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.705 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc66e0be1-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.707 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.711 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.714 226298 INFO os_vif [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1')#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.715 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:creationTime>2026-02-02 10:08:57</nova:creationTime>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:flavor name="m1.nano">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:memory>128</nova:memory>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:disk>1</nova:disk>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:swap>0</nova:swap>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:ephemeral>0</nova:ephemeral>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:vcpus>1</nova:vcpus>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </nova:flavor>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:owner>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </nova:owner>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  <nova:ports>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb  2 05:08:57 np0005604791 nova_compute[226294]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:    </nova:port>
Feb  2 05:08:57 np0005604791 nova_compute[226294]:  </nova:ports>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: </nova:instance>
Feb  2 05:08:57 np0005604791 nova_compute[226294]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Feb  2 05:08:57 np0005604791 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [NOTICE]   (232617) : haproxy version is 2.8.14-c23fe91
Feb  2 05:08:57 np0005604791 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [NOTICE]   (232617) : path to executable is /usr/sbin/haproxy
Feb  2 05:08:57 np0005604791 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [WARNING]  (232617) : Exiting Master process...
Feb  2 05:08:57 np0005604791 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [WARNING]  (232617) : Exiting Master process...
Feb  2 05:08:57 np0005604791 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [ALERT]    (232617) : Current worker (232619) exited with code 143 (Terminated)
Feb  2 05:08:57 np0005604791 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [WARNING]  (232617) : All workers exited. Exiting... (0)
Feb  2 05:08:57 np0005604791 systemd[1]: libpod-e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e.scope: Deactivated successfully.
Feb  2 05:08:57 np0005604791 podman[233021]: 2026-02-02 10:08:57.860983219 +0000 UTC m=+0.062057950 container died e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 05:08:57 np0005604791 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e-userdata-shm.mount: Deactivated successfully.
Feb  2 05:08:57 np0005604791 systemd[1]: var-lib-containers-storage-overlay-a7028958bbb9ac49e703bb1728fefda69b8f73736997e2045bf747f59bb53233-merged.mount: Deactivated successfully.
Feb  2 05:08:57 np0005604791 podman[233021]: 2026-02-02 10:08:57.908790769 +0000 UTC m=+0.109865450 container cleanup e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb  2 05:08:57 np0005604791 systemd[1]: libpod-conmon-e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e.scope: Deactivated successfully.
Feb  2 05:08:57 np0005604791 podman[233052]: 2026-02-02 10:08:57.966918853 +0000 UTC m=+0.039744507 container remove e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:08:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.971 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[4c49446b-dcf5-4302-bce9-72f0d53822a5]: (4, ('Mon Feb  2 10:08:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53 (e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e)\ne67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e\nMon Feb  2 10:08:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53 (e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e)\ne67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:08:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.973 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[6cfda328-e2a2-4fad-8174-779a6b2f1a36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:08:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.974 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape125f54e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.975 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:57 np0005604791 kernel: tape125f54e-70: left promiscuous mode
Feb  2 05:08:57 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b1b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:57 np0005604791 nova_compute[226294]: 2026-02-02 10:08:57.985 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:08:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.988 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1e7c1b-9377-4464-aafc-2717394ce588]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:08:58 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:58.002 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[625d9a8a-a61c-41be-bf2f-cc2c98bca11c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:08:58 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:58.004 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[81af7ca5-486e-4116-8fe4-4f72703f4c1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:08:58 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:58.016 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[aedc4d99-3611-4aed-b785-9fc8d4c019e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402123, 'reachable_time': 19451, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233067, 'error': None, 'target': 'ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:08:58 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:58.019 143813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Feb  2 05:08:58 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:08:58.019 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf5be8e-1268-4661-86f3-383ab39319fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:08:58 np0005604791 systemd[1]: run-netns-ovnmeta\x2de125f54e\x2d7556\x2d49c5\x2d8356\x2de7390df43c53.mount: Deactivated successfully.
Feb  2 05:08:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.213 226298 DEBUG nova.compute.manager [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-unplugged-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.214 226298 DEBUG oslo_concurrency.lockutils [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.214 226298 DEBUG oslo_concurrency.lockutils [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.214 226298 DEBUG oslo_concurrency.lockutils [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.215 226298 DEBUG nova.compute.manager [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-unplugged-c66e0be1-d166-4088-8ad8-baa84f3d032d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.215 226298 WARNING nova.compute.manager [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-unplugged-c66e0be1-d166-4088-8ad8-baa84f3d032d for instance with vm_state active and task_state None.#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.481 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.482 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.482 226298 DEBUG nova.network.neutron [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Feb  2 05:08:58 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.524 226298 DEBUG nova.compute.manager [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-deleted-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.524 226298 INFO nova.compute.manager [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Neutron deleted interface c66e0be1-d166-4088-8ad8-baa84f3d032d; detaching it from the instance and deleting it from the info cache#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.524 226298 DEBUG nova.network.neutron [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.545 226298 DEBUG nova.objects.instance [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lazy-loading 'system_metadata' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.598 226298 DEBUG nova.objects.instance [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lazy-loading 'flavor' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.635 226298 DEBUG nova.virt.libvirt.vif [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.636 226298 DEBUG nova.network.os_vif_util [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.637 226298 DEBUG nova.network.os_vif_util [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.642 226298 DEBUG nova.virt.libvirt.guest [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.647 226298 DEBUG nova.virt.libvirt.guest [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface>not found in domain: <domain type='kvm' id='2'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <name>instance-00000006</name>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <uuid>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</uuid>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <metadata>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:creationTime>2026-02-02 10:08:57</nova:creationTime>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:flavor name="m1.nano">
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:memory>128</nova:memory>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:disk>1</nova:disk>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:swap>0</nova:swap>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:ephemeral>0</nova:ephemeral>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:vcpus>1</nova:vcpus>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </nova:flavor>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:owner>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </nova:owner>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:ports>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </nova:port>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </nova:ports>
Feb  2 05:08:58 np0005604791 nova_compute[226294]: </nova:instance>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </metadata>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <memory unit='KiB'>131072</memory>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <currentMemory unit='KiB'>131072</currentMemory>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <vcpu placement='static'>1</vcpu>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <resource>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <partition>/machine</partition>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </resource>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <sysinfo type='smbios'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <system>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='manufacturer'>RDO</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='product'>OpenStack Compute</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='serial'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='uuid'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='family'>Virtual Machine</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </system>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </sysinfo>
Feb  2 05:08:58 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <os>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <boot dev='hd'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <smbios mode='sysinfo'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </os>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <features>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <acpi/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <apic/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <vmcoreinfo state='on'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </features>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <cpu mode='custom' match='exact' check='full'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <model fallback='forbid'>EPYC-Rome</model>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <vendor>AMD</vendor>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='x2apic'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='tsc-deadline'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='hypervisor'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='tsc_adjust'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='spec-ctrl'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='stibp'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='ssbd'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='cmp_legacy'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='overflow-recov'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='succor'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='ibrs'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='amd-ssbd'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='virt-ssbd'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='lbrv'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='tsc-scale'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='vmcb-clean'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='flushbyasid'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='pause-filter'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='pfthreshold'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='svme-addr-chk'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='lfence-always-serializing'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='xsaves'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='svm'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='topoext'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='npt'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='nrip-save'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </cpu>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <clock offset='utc'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <timer name='pit' tickpolicy='delay'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <timer name='rtc' tickpolicy='catchup'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <timer name='hpet' present='no'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </clock>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <on_poweroff>destroy</on_poweroff>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <on_reboot>restart</on_reboot>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <on_crash>destroy</on_crash>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <devices>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <disk type='network' device='disk'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <driver name='qemu' type='raw' cache='none'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <auth username='openstack'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk' index='2'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.100' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.102' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.101' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target dev='vda' bus='virtio'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='virtio-disk0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <disk type='network' device='cdrom'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <driver name='qemu' type='raw' cache='none'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <auth username='openstack'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config' index='1'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.100' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.102' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.101' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target dev='sda' bus='sata'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <readonly/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='sata0-0-0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='0' model='pcie-root'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pcie.0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='1' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='1' port='0x10'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='2' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='2' port='0x11'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='3' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='3' port='0x12'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.3'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='4' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='4' port='0x13'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.4'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='5' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='5' port='0x14'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.5'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='6' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='6' port='0x15'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.6'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='7' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='7' port='0x16'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.7'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='8' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='8' port='0x17'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.8'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='9' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='9' port='0x18'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.9'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='10' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='10' port='0x19'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.10'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='11' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='11' port='0x1a'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.11'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='12' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='12' port='0x1b'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.12'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='13' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='13' port='0x1c'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.13'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='14' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='14' port='0x1d'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.14'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='15' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='15' port='0x1e'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.15'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='16' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='16' port='0x1f'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.16'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='17' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='17' port='0x20'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.17'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='18' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='18' port='0x21'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.18'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='19' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='19' port='0x22'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.19'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='20' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='20' port='0x23'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.20'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='21' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='21' port='0x24'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.21'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='22' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='22' port='0x25'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.22'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='23' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='23' port='0x26'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.23'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='24' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='24' port='0x27'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.24'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='25' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='25' port='0x28'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.25'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-pci-bridge'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.26'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='usb' index='0' model='piix3-uhci'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='usb'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='sata' index='0'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='ide'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <interface type='ethernet'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <mac address='fa:16:3e:85:9a:96'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target dev='tap09a00258-4f'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model type='virtio'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <driver name='vhost' rx_queue_size='512'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <mtu size='1442'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='net0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </interface>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <serial type='pty'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <source path='/dev/pts/0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target type='isa-serial' port='0'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <model name='isa-serial'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      </target>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='serial0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </serial>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <console type='pty' tty='/dev/pts/0'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <source path='/dev/pts/0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target type='serial' port='0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='serial0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </console>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <input type='tablet' bus='usb'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='input0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='usb' bus='0' port='1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <input type='mouse' bus='ps2'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='input1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <input type='keyboard' bus='ps2'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='input2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <listen type='address' address='::0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </graphics>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <audio id='1' type='none'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <video>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model type='virtio' heads='1' primary='yes'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='video0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </video>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <watchdog model='itco' action='reset'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='watchdog0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </watchdog>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <memballoon model='virtio'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <stats period='10'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='balloon0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </memballoon>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <rng model='virtio'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <backend model='random'>/dev/urandom</backend>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='rng0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </rng>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </devices>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <label>system_u:system_r:svirt_t:s0:c659,c775</label>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c659,c775</imagelabel>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </seclabel>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <label>+107:+107</label>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <imagelabel>+107:+107</imagelabel>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </seclabel>
Feb  2 05:08:58 np0005604791 nova_compute[226294]: </domain>
Feb  2 05:08:58 np0005604791 nova_compute[226294]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.648 226298 DEBUG nova.virt.libvirt.guest [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.653 226298 DEBUG nova.virt.libvirt.guest [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface>not found in domain: <domain type='kvm' id='2'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <name>instance-00000006</name>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <uuid>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</uuid>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <metadata>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:creationTime>2026-02-02 10:08:57</nova:creationTime>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:flavor name="m1.nano">
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:memory>128</nova:memory>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:disk>1</nova:disk>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:swap>0</nova:swap>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:ephemeral>0</nova:ephemeral>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:vcpus>1</nova:vcpus>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </nova:flavor>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:owner>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </nova:owner>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:ports>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </nova:port>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </nova:ports>
Feb  2 05:08:58 np0005604791 nova_compute[226294]: </nova:instance>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </metadata>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <memory unit='KiB'>131072</memory>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <currentMemory unit='KiB'>131072</currentMemory>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <vcpu placement='static'>1</vcpu>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <resource>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <partition>/machine</partition>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </resource>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <sysinfo type='smbios'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <system>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='manufacturer'>RDO</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='product'>OpenStack Compute</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='serial'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='uuid'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <entry name='family'>Virtual Machine</entry>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </system>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </sysinfo>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <os>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <boot dev='hd'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <smbios mode='sysinfo'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </os>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <features>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <acpi/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <apic/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <vmcoreinfo state='on'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </features>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <cpu mode='custom' match='exact' check='full'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <model fallback='forbid'>EPYC-Rome</model>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <vendor>AMD</vendor>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='x2apic'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='tsc-deadline'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='hypervisor'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='tsc_adjust'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='spec-ctrl'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='stibp'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='ssbd'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='cmp_legacy'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='overflow-recov'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='succor'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='ibrs'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='amd-ssbd'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='virt-ssbd'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='lbrv'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='tsc-scale'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='vmcb-clean'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='flushbyasid'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='pause-filter'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='pfthreshold'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='svme-addr-chk'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='lfence-always-serializing'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='xsaves'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='svm'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='require' name='topoext'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='npt'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <feature policy='disable' name='nrip-save'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </cpu>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <clock offset='utc'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <timer name='pit' tickpolicy='delay'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <timer name='rtc' tickpolicy='catchup'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <timer name='hpet' present='no'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </clock>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <on_poweroff>destroy</on_poweroff>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <on_reboot>restart</on_reboot>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <on_crash>destroy</on_crash>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <devices>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <disk type='network' device='disk'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <driver name='qemu' type='raw' cache='none'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <auth username='openstack'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk' index='2'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.100' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.102' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.101' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target dev='vda' bus='virtio'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='virtio-disk0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <disk type='network' device='cdrom'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <driver name='qemu' type='raw' cache='none'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <auth username='openstack'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config' index='1'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.100' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.102' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <host name='192.168.122.101' port='6789'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target dev='sda' bus='sata'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <readonly/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='sata0-0-0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='0' model='pcie-root'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pcie.0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='1' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='1' port='0x10'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='2' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='2' port='0x11'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='3' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='3' port='0x12'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.3'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='4' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='4' port='0x13'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.4'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='5' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='5' port='0x14'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.5'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='6' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='6' port='0x15'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.6'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='7' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='7' port='0x16'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.7'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='8' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='8' port='0x17'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.8'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='9' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='9' port='0x18'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.9'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='10' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='10' port='0x19'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.10'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='11' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='11' port='0x1a'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.11'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='12' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='12' port='0x1b'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.12'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='13' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='13' port='0x1c'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.13'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='14' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='14' port='0x1d'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.14'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='15' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='15' port='0x1e'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.15'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='16' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='16' port='0x1f'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.16'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='17' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='17' port='0x20'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.17'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='18' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='18' port='0x21'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.18'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='19' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='19' port='0x22'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.19'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='20' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='20' port='0x23'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.20'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='21' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='21' port='0x24'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.21'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='22' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='22' port='0x25'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.22'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='23' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='23' port='0x26'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.23'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='24' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='24' port='0x27'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.24'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='25' model='pcie-root-port'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-root-port'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target chassis='25' port='0x28'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.25'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model name='pcie-pci-bridge'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='pci.26'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='usb' index='0' model='piix3-uhci'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='usb'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <controller type='sata' index='0'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='ide'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </controller>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <interface type='ethernet'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <mac address='fa:16:3e:85:9a:96'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target dev='tap09a00258-4f'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model type='virtio'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <driver name='vhost' rx_queue_size='512'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <mtu size='1442'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='net0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </interface>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <serial type='pty'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <source path='/dev/pts/0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target type='isa-serial' port='0'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:        <model name='isa-serial'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      </target>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='serial0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </serial>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <console type='pty' tty='/dev/pts/0'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <source path='/dev/pts/0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <target type='serial' port='0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='serial0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </console>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <input type='tablet' bus='usb'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='input0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='usb' bus='0' port='1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <input type='mouse' bus='ps2'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='input1'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <input type='keyboard' bus='ps2'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='input2'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </input>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <listen type='address' address='::0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </graphics>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <audio id='1' type='none'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <video>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <model type='virtio' heads='1' primary='yes'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='video0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </video>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <watchdog model='itco' action='reset'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='watchdog0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </watchdog>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <memballoon model='virtio'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <stats period='10'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='balloon0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </memballoon>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <rng model='virtio'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <backend model='random'>/dev/urandom</backend>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <alias name='rng0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </rng>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </devices>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <label>system_u:system_r:svirt_t:s0:c659,c775</label>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c659,c775</imagelabel>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </seclabel>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <label>+107:+107</label>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <imagelabel>+107:+107</imagelabel>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </seclabel>
Feb  2 05:08:58 np0005604791 nova_compute[226294]: </domain>
Feb  2 05:08:58 np0005604791 nova_compute[226294]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.654 226298 WARNING nova.virt.libvirt.driver [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Detaching interface fa:16:3e:2f:49:24 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapc66e0be1-d1' not found.
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.655 226298 DEBUG nova.virt.libvirt.vif [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.656 226298 DEBUG nova.network.os_vif_util [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.657 226298 DEBUG nova.network.os_vif_util [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.658 226298 DEBUG os_vif [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.660 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.661 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc66e0be1-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.661 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.666 226298 INFO os_vif [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1')
Feb  2 05:08:58 np0005604791 nova_compute[226294]: 2026-02-02 10:08:58.667 226298 DEBUG nova.virt.libvirt.guest [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:creationTime>2026-02-02 10:08:58</nova:creationTime>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:flavor name="m1.nano">
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:memory>128</nova:memory>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:disk>1</nova:disk>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:swap>0</nova:swap>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:ephemeral>0</nova:ephemeral>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:vcpus>1</nova:vcpus>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </nova:flavor>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:owner>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </nova:owner>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  <nova:ports>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb  2 05:08:58 np0005604791 nova_compute[226294]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:    </nova:port>
Feb  2 05:08:58 np0005604791 nova_compute[226294]:  </nova:ports>
Feb  2 05:08:58 np0005604791 nova_compute[226294]: </nova:instance>
Feb  2 05:08:58 np0005604791 nova_compute[226294]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb  2 05:08:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:08:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:59.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:08:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:08:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:08:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:59.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:08:59 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b1d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.307 226298 DEBUG nova.compute.manager [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.307 226298 DEBUG oslo_concurrency.lockutils [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.307 226298 DEBUG oslo_concurrency.lockutils [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.308 226298 DEBUG oslo_concurrency.lockutils [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.308 226298 DEBUG nova.compute.manager [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.308 226298 WARNING nova.compute.manager [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d for instance with vm_state active and task_state None.
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.342 226298 INFO nova.network.neutron [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Port c66e0be1-d166-4088-8ad8-baa84f3d032d from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.342 226298 DEBUG nova.network.neutron [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.358 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.391 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-c66e0be1-d166-4088-8ad8-baa84f3d032d" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.902s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:09:00 np0005604791 ovn_controller[133666]: 2026-02-02T10:09:00Z|00053|binding|INFO|Releasing lport f5df8d3e-4c61-4492-9e28-98679c02afcc from this chassis (sb_readonly=0)
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.502 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:09:00 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:00 np0005604791 nova_compute[226294]: 2026-02-02 10:09:00.713 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:09:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:01.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:01.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.699 226298 DEBUG nova.compute.manager [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.699 226298 DEBUG nova.compute.manager [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing instance network info cache due to event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.700 226298 DEBUG oslo_concurrency.lockutils [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.700 226298 DEBUG oslo_concurrency.lockutils [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.700 226298 DEBUG nova.network.neutron [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.785 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.786 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.786 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.786 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.787 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.789 226298 INFO nova.compute.manager [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Terminating instance
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.790 226298 DEBUG nova.compute.manager [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb  2 05:09:01 np0005604791 kernel: tap09a00258-4f (unregistering): left promiscuous mode
Feb  2 05:09:01 np0005604791 NetworkManager[49055]: <info>  [1770026941.8556] device (tap09a00258-4f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.863 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:09:01 np0005604791 ovn_controller[133666]: 2026-02-02T10:09:01Z|00054|binding|INFO|Releasing lport 09a00258-4f60-42dd-a769-b2ea3b870187 from this chassis (sb_readonly=0)
Feb  2 05:09:01 np0005604791 ovn_controller[133666]: 2026-02-02T10:09:01Z|00055|binding|INFO|Setting lport 09a00258-4f60-42dd-a769-b2ea3b870187 down in Southbound
Feb  2 05:09:01 np0005604791 ovn_controller[133666]: 2026-02-02T10:09:01Z|00056|binding|INFO|Removing iface tap09a00258-4f ovn-installed in OVS
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.865 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:09:01 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:01.873 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:9a:96 10.100.0.10'], port_security=['fa:16:3e:85:9a:96 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '09104532-215f-4de3-9920-7fd818e6c676', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=755f8a60-018a-461f-bb4b-b9017895ccf7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=09a00258-4f60-42dd-a769-b2ea3b870187) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb  2 05:09:01 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:01.875 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 09a00258-4f60-42dd-a769-b2ea3b870187 in datapath ba6c4c87-77a9-4fcc-aa14-a4637c78f692 unbound from our chassis
Feb  2 05:09:01 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:01.877 143542 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ba6c4c87-77a9-4fcc-aa14-a4637c78f692, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb  2 05:09:01 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:01.878 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b0ac4a-c5c1-449e-88c7-81aa4dda2f80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb  2 05:09:01 np0005604791 nova_compute[226294]: 2026-02-02 10:09:01.879 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:09:01 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:01.881 143542 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 namespace which is not needed anymore
Feb  2 05:09:01 np0005604791 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Deactivated successfully.
Feb  2 05:09:01 np0005604791 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Consumed 17.991s CPU time.
Feb  2 05:09:01 np0005604791 systemd-machined[195072]: Machine qemu-2-instance-00000006 terminated.
Feb  2 05:09:01 np0005604791 podman[233097]: 2026-02-02 10:09:01.967594008 +0000 UTC m=+0.084438955 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb  2 05:09:01 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:01 np0005604791 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [NOTICE]   (232207) : haproxy version is 2.8.14-c23fe91
Feb  2 05:09:01 np0005604791 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [NOTICE]   (232207) : path to executable is /usr/sbin/haproxy
Feb  2 05:09:01 np0005604791 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [WARNING]  (232207) : Exiting Master process...
Feb  2 05:09:01 np0005604791 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [WARNING]  (232207) : Exiting Master process...
Feb  2 05:09:01 np0005604791 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [ALERT]    (232207) : Current worker (232217) exited with code 143 (Terminated)
Feb  2 05:09:01 np0005604791 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [WARNING]  (232207) : All workers exited. Exiting... (0)
Feb  2 05:09:01 np0005604791 systemd[1]: libpod-2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81.scope: Deactivated successfully.
Feb  2 05:09:02 np0005604791 podman[233145]: 2026-02-02 10:09:02.00454058 +0000 UTC m=+0.045054829 container died 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb  2 05:09:02 np0005604791 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81-userdata-shm.mount: Deactivated successfully.
Feb  2 05:09:02 np0005604791 systemd[1]: var-lib-containers-storage-overlay-b7e71b523a6cf72f6079510db5422c0e2666a6b8442a4c07506d8ee1c5789881-merged.mount: Deactivated successfully.
Feb  2 05:09:02 np0005604791 podman[233145]: 2026-02-02 10:09:02.048198729 +0000 UTC m=+0.088712968 container cleanup 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.046 226298 INFO nova.virt.libvirt.driver [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Instance destroyed successfully.#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.048 226298 DEBUG nova.objects.instance [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'resources' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:09:02 np0005604791 systemd[1]: libpod-conmon-2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81.scope: Deactivated successfully.
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.063 226298 DEBUG nova.virt.libvirt.vif [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.063 226298 DEBUG nova.network.os_vif_util [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.065 226298 DEBUG nova.network.os_vif_util [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.065 226298 DEBUG os_vif [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.068 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.070 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09a00258-4f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:09:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.073 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.076 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.081 226298 INFO os_vif [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f')#033[00m
Feb  2 05:09:02 np0005604791 podman[233190]: 2026-02-02 10:09:02.112002725 +0000 UTC m=+0.045071609 container remove 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb  2 05:09:02 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.117 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5e989ec3-a3db-4586-8b88-cab6e49c8513]: (4, ('Mon Feb  2 10:09:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 (2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81)\n2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81\nMon Feb  2 10:09:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 (2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81)\n2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:09:02 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.118 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[e49fcec6-e611-43c7-8528-604f3ac06b49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:09:02 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.119 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba6c4c87-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.121 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:02 np0005604791 kernel: tapba6c4c87-70: left promiscuous mode
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.127 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:02 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.129 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[04cba2ad-3bf2-4d91-a918-6d43504b7089]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:09:02 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.146 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[cecf340c-9100-411b-9ef5-e67eda71bc47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:09:02 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.147 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d3a369-b465-4324-ba52-3ee8c9f9c962]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:09:02 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.161 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[fd92aaab-3cba-40b1-96d3-be2ba50cd190]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399078, 'reachable_time': 35155, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233221, 'error': None, 'target': 'ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:09:02 np0005604791 systemd[1]: run-netns-ovnmeta\x2dba6c4c87\x2d77a9\x2d4fcc\x2daa14\x2da4637c78f692.mount: Deactivated successfully.
Feb  2 05:09:02 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.163 143813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Feb  2 05:09:02 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.164 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc6b143-94d2-484e-b114-1b843a192a83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.377 226298 DEBUG nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-unplugged-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.378 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.379 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.379 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.379 226298 DEBUG nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-unplugged-09a00258-4f60-42dd-a769-b2ea3b870187 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.380 226298 DEBUG nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-unplugged-09a00258-4f60-42dd-a769-b2ea3b870187 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.380 226298 DEBUG nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.380 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.380 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.381 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.381 226298 DEBUG nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.381 226298 WARNING nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 for instance with vm_state active and task_state deleting.#033[00m
Feb  2 05:09:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:02 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b1f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.513 226298 INFO nova.virt.libvirt.driver [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Deleting instance files /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_del#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.514 226298 INFO nova.virt.libvirt.driver [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Deletion of /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_del complete#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.559 226298 INFO nova.compute.manager [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Took 0.77 seconds to destroy the instance on the hypervisor.#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.561 226298 DEBUG oslo.service.loopingcall [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.561 226298 DEBUG nova.compute.manager [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.561 226298 DEBUG nova.network.neutron [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.712 226298 DEBUG nova.network.neutron [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated VIF entry in instance network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.712 226298 DEBUG nova.network.neutron [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:09:02 np0005604791 nova_compute[226294]: 2026-02-02 10:09:02.732 226298 DEBUG oslo_concurrency.lockutils [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:09:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:03.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:03.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.635 226298 DEBUG nova.network.neutron [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.650 226298 INFO nova.compute.manager [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Took 1.09 seconds to deallocate network for instance.#033[00m
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.709 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.710 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.744 226298 DEBUG nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.770 226298 DEBUG nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.771 226298 DEBUG nova.compute.provider_tree [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.789 226298 DEBUG nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.809 226298 DEBUG nova.compute.manager [req-152ddc3c-7b1d-4940-af10-4df5772e9263 req-ba355b06-d9da-45f7-a2b5-69a117efce9d b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-deleted-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.814 226298 DEBUG nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Feb  2 05:09:03 np0005604791 nova_compute[226294]: 2026-02-02 10:09:03.861 226298 DEBUG oslo_concurrency.processutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:09:03 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:04 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:04.233 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:09:04 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:09:04 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2825372233' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:09:04 np0005604791 nova_compute[226294]: 2026-02-02 10:09:04.333 226298 DEBUG oslo_concurrency.processutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:09:04 np0005604791 nova_compute[226294]: 2026-02-02 10:09:04.340 226298 DEBUG nova.compute.provider_tree [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:09:04 np0005604791 nova_compute[226294]: 2026-02-02 10:09:04.356 226298 DEBUG nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:09:04 np0005604791 nova_compute[226294]: 2026-02-02 10:09:04.379 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:09:04 np0005604791 nova_compute[226294]: 2026-02-02 10:09:04.432 226298 INFO nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Deleted allocations for instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220#033[00m
Feb  2 05:09:04 np0005604791 nova_compute[226294]: 2026-02-02 10:09:04.509 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:09:04 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:05.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:05.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:05 np0005604791 nova_compute[226294]: 2026-02-02 10:09:05.761 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:05 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b210 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:06 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:07 np0005604791 nova_compute[226294]: 2026-02-02 10:09:07.072 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:07.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:07.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:07 np0005604791 nova_compute[226294]: 2026-02-02 10:09:07.773 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:07 np0005604791 nova_compute[226294]: 2026-02-02 10:09:07.798 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:07 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:08 np0005604791 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb  2 05:09:08 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 05:09:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 11K writes, 41K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 11K writes, 2812 syncs, 3.92 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2196 writes, 6935 keys, 2196 commit groups, 1.0 writes per commit group, ingest: 6.82 MB, 0.01 MB/s#012Interval WAL: 2196 writes, 936 syncs, 2.35 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb  2 05:09:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:09.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:09.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:09 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:10 np0005604791 podman[233251]: 2026-02-02 10:09:10.39647417 +0000 UTC m=+0.068073840 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb  2 05:09:10 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:10 np0005604791 nova_compute[226294]: 2026-02-02 10:09:10.680 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:09:10 np0005604791 nova_compute[226294]: 2026-02-02 10:09:10.804 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:11.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:11.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:11 np0005604791 nova_compute[226294]: 2026-02-02 10:09:11.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:09:11 np0005604791 nova_compute[226294]: 2026-02-02 10:09:11.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:09:11 np0005604791 nova_compute[226294]: 2026-02-02 10:09:11.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:09:11 np0005604791 nova_compute[226294]: 2026-02-02 10:09:11.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:09:11 np0005604791 nova_compute[226294]: 2026-02-02 10:09:11.676 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:09:11 np0005604791 nova_compute[226294]: 2026-02-02 10:09:11.676 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:09:11 np0005604791 nova_compute[226294]: 2026-02-02 10:09:11.677 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:09:11 np0005604791 nova_compute[226294]: 2026-02-02 10:09:11.677 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:09:11 np0005604791 nova_compute[226294]: 2026-02-02 10:09:11.678 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:09:11 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:12 np0005604791 nova_compute[226294]: 2026-02-02 10:09:12.109 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:12 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:12 np0005604791 nova_compute[226294]: 2026-02-02 10:09:12.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:09:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:13.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:13.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:13 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b270 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:14 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:14 np0005604791 nova_compute[226294]: 2026-02-02 10:09:14.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:09:14 np0005604791 nova_compute[226294]: 2026-02-02 10:09:14.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:09:14 np0005604791 nova_compute[226294]: 2026-02-02 10:09:14.673 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:09:14 np0005604791 nova_compute[226294]: 2026-02-02 10:09:14.673 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:09:14 np0005604791 nova_compute[226294]: 2026-02-02 10:09:14.674 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:09:14 np0005604791 nova_compute[226294]: 2026-02-02 10:09:14.674 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:09:14 np0005604791 nova_compute[226294]: 2026-02-02 10:09:14.674 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:09:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:09:15 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3180929845' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.149 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:09:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:15.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:15.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.346 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.348 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4891MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.349 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.349 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.425 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.426 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.447 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.811 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:15 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:09:15 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/422795760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.941 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.948 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.967 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.994 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:09:15 np0005604791 nova_compute[226294]: 2026-02-02 10:09:15.995 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:09:15 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:16 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:17 np0005604791 nova_compute[226294]: 2026-02-02 10:09:17.032 226298 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1770026942.0311568, 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Feb  2 05:09:17 np0005604791 nova_compute[226294]: 2026-02-02 10:09:17.033 226298 INFO nova.compute.manager [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] VM Stopped (Lifecycle Event)#033[00m
Feb  2 05:09:17 np0005604791 nova_compute[226294]: 2026-02-02 10:09:17.051 226298 DEBUG nova.compute.manager [None req-63b1e011-ab2b-41ab-922b-4a317db92e93 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb  2 05:09:17 np0005604791 nova_compute[226294]: 2026-02-02 10:09:17.112 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:17.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:17.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:18 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:19.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:19.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:20 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:20 np0005604791 nova_compute[226294]: 2026-02-02 10:09:20.812 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:21.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:21.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.448649) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961448736, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2337, "num_deletes": 251, "total_data_size": 5842028, "memory_usage": 5928096, "flush_reason": "Manual Compaction"}
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961487343, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3781117, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26072, "largest_seqno": 28404, "table_properties": {"data_size": 3772144, "index_size": 5467, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19281, "raw_average_key_size": 20, "raw_value_size": 3753887, "raw_average_value_size": 3918, "num_data_blocks": 242, "num_entries": 958, "num_filter_entries": 958, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026763, "oldest_key_time": 1770026763, "file_creation_time": 1770026961, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 38746 microseconds, and 8936 cpu microseconds.
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.487407) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3781117 bytes OK
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.487434) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.489951) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.489980) EVENT_LOG_v1 {"time_micros": 1770026961489972, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.490005) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 5831682, prev total WAL file size 5831682, number of live WAL files 2.
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.491349) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3692KB)], [51(11MB)]
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961491413, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 16145690, "oldest_snapshot_seqno": -1}
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5849 keys, 13998793 bytes, temperature: kUnknown
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961637830, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 13998793, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13959385, "index_size": 23682, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14661, "raw_key_size": 148732, "raw_average_key_size": 25, "raw_value_size": 13853557, "raw_average_value_size": 2368, "num_data_blocks": 966, "num_entries": 5849, "num_filter_entries": 5849, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026961, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.638190) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 13998793 bytes
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.639560) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 110.2 rd, 95.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 11.8 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 6365, records dropped: 516 output_compression: NoCompression
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.639589) EVENT_LOG_v1 {"time_micros": 1770026961639575, "job": 30, "event": "compaction_finished", "compaction_time_micros": 146512, "compaction_time_cpu_micros": 34138, "output_level": 6, "num_output_files": 1, "total_output_size": 13998793, "num_input_records": 6365, "num_output_records": 5849, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961640389, "job": 30, "event": "table_file_deletion", "file_number": 53}
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961642408, "job": 30, "event": "table_file_deletion", "file_number": 51}
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.491219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.642493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.642708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.642714) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.642716) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:09:21 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.642719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:09:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001ba0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:22 np0005604791 nova_compute[226294]: 2026-02-02 10:09:22.154 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:22 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:23.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:23.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001ba0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:24 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb  2 05:09:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:25.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb  2 05:09:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:25.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:25 np0005604791 nova_compute[226294]: 2026-02-02 10:09:25.815 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:26 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001d40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:27 np0005604791 nova_compute[226294]: 2026-02-02 10:09:27.182 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:27.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:27.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:28 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:29.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:29.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001d40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:30 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:30 np0005604791 nova_compute[226294]: 2026-02-02 10:09:30.816 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:31.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:31.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:32 np0005604791 nova_compute[226294]: 2026-02-02 10:09:32.184 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:32 np0005604791 podman[233354]: 2026-02-02 10:09:32.468188837 +0000 UTC m=+0.130979821 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb  2 05:09:32 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:33.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:33.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:34 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 05:09:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 5356 writes, 28K keys, 5356 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s#012Cumulative WAL: 5356 writes, 5356 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1530 writes, 7398 keys, 1530 commit groups, 1.0 writes per commit group, ingest: 16.91 MB, 0.03 MB/s#012Interval WAL: 1530 writes, 1530 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    109.1      0.40              0.11        15    0.027       0      0       0.0       0.0#012  L6      1/0   13.35 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.0    119.4    101.8      1.73              0.40        14    0.123     74K   7395       0.0       0.0#012 Sum      1/0   13.35 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.0     96.9    103.2      2.13              0.51        29    0.073     74K   7395       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.8     99.4    101.1      0.73              0.18        10    0.073     30K   2565       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    119.4    101.8      1.73              0.40        14    0.123     74K   7395       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    109.6      0.40              0.11        14    0.028       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.043, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.21 GB write, 0.12 MB/s write, 0.20 GB read, 0.11 MB/s read, 2.1 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a64debd350#2 capacity: 304.00 MB usage: 17.54 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.00019 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(926,16.95 MB,5.57595%) FilterBlock(29,220.61 KB,0.070868%) IndexBlock(29,383.73 KB,0.12327%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Feb  2 05:09:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:35.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:35.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:35 np0005604791 nova_compute[226294]: 2026-02-02 10:09:35.841 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:36 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:37 np0005604791 nova_compute[226294]: 2026-02-02 10:09:37.213 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:37.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:37.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb  2 05:09:38 np0005604791 kernel: ganesha.nfsd[232989]: segfault at 50 ip 00007f8543c9b32e sp 00007f84a77fd210 error 4 in libntirpc.so.5.8[7f8543c80000+2c000] likely on CPU 1 (core 0, socket 1)
Feb  2 05:09:38 np0005604791 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb  2 05:09:38 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy ignored for local
Feb  2 05:09:38 np0005604791 systemd[1]: Started Process Core Dump (PID 233434/UID 0).
Feb  2 05:09:39 np0005604791 systemd-coredump[233437]: Process 229173 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 81:#012#0  0x00007f8543c9b32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Feb  2 05:09:39 np0005604791 systemd[1]: systemd-coredump@13-233434-0.service: Deactivated successfully.
Feb  2 05:09:39 np0005604791 podman[233595]: 2026-02-02 10:09:39.141565911 +0000 UTC m=+0.039461879 container died 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb  2 05:09:39 np0005604791 systemd[1]: var-lib-containers-storage-overlay-aab1a8eaea1398d15f9f0e4bf76ebc7ab73640ad6ddb093b4227eedbb09799dc-merged.mount: Deactivated successfully.
Feb  2 05:09:39 np0005604791 podman[233595]: 2026-02-02 10:09:39.180658599 +0000 UTC m=+0.078554497 container remove 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:09:39 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb  2 05:09:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:39.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:39 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:39 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:39 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:39 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:39.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:39 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 05:09:39 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.782s CPU time.
Feb  2 05:09:39 np0005604791 podman[233678]: 2026-02-02 10:09:39.481740535 +0000 UTC m=+0.064145684 container create 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:09:39 np0005604791 systemd[1]: Started libpod-conmon-2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d.scope.
Feb  2 05:09:39 np0005604791 systemd[1]: Started libcrun container.
Feb  2 05:09:39 np0005604791 podman[233678]: 2026-02-02 10:09:39.458756345 +0000 UTC m=+0.041161504 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 05:09:39 np0005604791 podman[233678]: 2026-02-02 10:09:39.561228826 +0000 UTC m=+0.143634025 container init 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb  2 05:09:39 np0005604791 podman[233678]: 2026-02-02 10:09:39.571509679 +0000 UTC m=+0.153914828 container start 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Feb  2 05:09:39 np0005604791 podman[233678]: 2026-02-02 10:09:39.57642783 +0000 UTC m=+0.158833039 container attach 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:09:39 np0005604791 beautiful_chatterjee[233694]: 167 167
Feb  2 05:09:39 np0005604791 systemd[1]: libpod-2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d.scope: Deactivated successfully.
Feb  2 05:09:39 np0005604791 podman[233678]: 2026-02-02 10:09:39.580340854 +0000 UTC m=+0.162745973 container died 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:09:39 np0005604791 systemd[1]: var-lib-containers-storage-overlay-57ffeb4ee93a5dee9150cae19b2072c7324f0e57bb2e2112cb59a9de266a868e-merged.mount: Deactivated successfully.
Feb  2 05:09:39 np0005604791 podman[233678]: 2026-02-02 10:09:39.622508144 +0000 UTC m=+0.204913293 container remove 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Feb  2 05:09:39 np0005604791 systemd[1]: libpod-conmon-2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d.scope: Deactivated successfully.
Feb  2 05:09:39 np0005604791 podman[233719]: 2026-02-02 10:09:39.785529944 +0000 UTC m=+0.060399716 container create 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True)
Feb  2 05:09:39 np0005604791 systemd[1]: Started libpod-conmon-0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca.scope.
Feb  2 05:09:39 np0005604791 podman[233719]: 2026-02-02 10:09:39.75827999 +0000 UTC m=+0.033149792 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb  2 05:09:39 np0005604791 systemd[1]: Started libcrun container.
Feb  2 05:09:39 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62a567448a77d4ae7e7cf080a0ff49d6e20e8117f8d43b2f65ef01d5deb92542/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb  2 05:09:39 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62a567448a77d4ae7e7cf080a0ff49d6e20e8117f8d43b2f65ef01d5deb92542/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb  2 05:09:39 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62a567448a77d4ae7e7cf080a0ff49d6e20e8117f8d43b2f65ef01d5deb92542/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb  2 05:09:39 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62a567448a77d4ae7e7cf080a0ff49d6e20e8117f8d43b2f65ef01d5deb92542/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb  2 05:09:39 np0005604791 podman[233719]: 2026-02-02 10:09:39.87727417 +0000 UTC m=+0.152143902 container init 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb  2 05:09:39 np0005604791 podman[233719]: 2026-02-02 10:09:39.886535256 +0000 UTC m=+0.161405008 container start 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb  2 05:09:39 np0005604791 podman[233719]: 2026-02-02 10:09:39.893927452 +0000 UTC m=+0.168797264 container attach 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]: [
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:    {
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        "available": false,
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        "being_replaced": false,
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        "ceph_device_lvm": false,
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        "device_id": "QEMU_DVD-ROM_QM00001",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        "lsm_data": {},
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        "lvs": [],
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        "path": "/dev/sr0",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        "rejected_reasons": [
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "Has a FileSystem",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "Insufficient space (<5GB)"
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        ],
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        "sys_api": {
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "actuators": null,
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "device_nodes": [
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:                "sr0"
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            ],
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "devname": "sr0",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "human_readable_size": "482.00 KB",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "id_bus": "ata",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "model": "QEMU DVD-ROM",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "nr_requests": "2",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "parent": "/dev/sr0",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "partitions": {},
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "path": "/dev/sr0",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "removable": "1",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "rev": "2.5+",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "ro": "0",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "rotational": "1",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "sas_address": "",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "sas_device_handle": "",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "scheduler_mode": "mq-deadline",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "sectors": 0,
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "sectorsize": "2048",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "size": 493568.0,
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "support_discard": "2048",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "type": "disk",
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:            "vendor": "QEMU"
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:        }
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]:    }
Feb  2 05:09:40 np0005604791 priceless_cannon[233735]: ]
Feb  2 05:09:40 np0005604791 systemd[1]: libpod-0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca.scope: Deactivated successfully.
Feb  2 05:09:40 np0005604791 podman[233719]: 2026-02-02 10:09:40.594767226 +0000 UTC m=+0.869636948 container died 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Feb  2 05:09:40 np0005604791 systemd[1]: var-lib-containers-storage-overlay-62a567448a77d4ae7e7cf080a0ff49d6e20e8117f8d43b2f65ef01d5deb92542-merged.mount: Deactivated successfully.
Feb  2 05:09:40 np0005604791 podman[233719]: 2026-02-02 10:09:40.63972578 +0000 UTC m=+0.914595512 container remove 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:09:40 np0005604791 systemd[1]: libpod-conmon-0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca.scope: Deactivated successfully.
Feb  2 05:09:40 np0005604791 podman[234850]: 2026-02-02 10:09:40.66683068 +0000 UTC m=+0.051731835 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb  2 05:09:40 np0005604791 nova_compute[226294]: 2026-02-02 10:09:40.877 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:41.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:41.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:09:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:41 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:09:42 np0005604791 nova_compute[226294]: 2026-02-02 10:09:42.248 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:42 np0005604791 ovn_controller[133666]: 2026-02-02T10:09:42Z|00057|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb  2 05:09:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:43.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:43.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:44 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100944 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 05:09:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:44.909 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:09:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:44.909 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:09:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:44.910 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:09:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:45.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:45.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:45 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:45 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:09:45 np0005604791 nova_compute[226294]: 2026-02-02 10:09:45.878 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:47 np0005604791 nova_compute[226294]: 2026-02-02 10:09:47.251 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:47.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:47.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:49.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:49.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:49 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 14.
Feb  2 05:09:49 np0005604791 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 05:09:49 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.782s CPU time.
Feb  2 05:09:49 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Start request repeated too quickly.
Feb  2 05:09:49 np0005604791 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb  2 05:09:49 np0005604791 systemd[1]: Failed to start Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb  2 05:09:50 np0005604791 nova_compute[226294]: 2026-02-02 10:09:50.924 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:51.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:51.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:52 np0005604791 nova_compute[226294]: 2026-02-02 10:09:52.283 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:53.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:53.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:55 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:55.066 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:09:55 np0005604791 nova_compute[226294]: 2026-02-02 10:09:55.068 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:55 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:09:55.068 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb  2 05:09:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:55.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:09:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:55.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:09:55 np0005604791 nova_compute[226294]: 2026-02-02 10:09:55.970 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:57.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:09:57 np0005604791 nova_compute[226294]: 2026-02-02 10:09:57.323 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:09:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:57.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:09:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:09:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:59.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:09:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:09:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:09:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:59.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:00 np0005604791 ceph-mon[80115]: overall HEALTH_OK
Feb  2 05:10:01 np0005604791 nova_compute[226294]: 2026-02-02 10:10:01.010 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:01.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:01.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:02 np0005604791 nova_compute[226294]: 2026-02-02 10:10:02.371 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:03.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:03.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:03 np0005604791 podman[234941]: 2026-02-02 10:10:03.481007935 +0000 UTC m=+0.151066183 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb  2 05:10:05 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:10:05.071 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:10:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:05.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:05.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:06 np0005604791 nova_compute[226294]: 2026-02-02 10:10:06.011 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:07.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:07.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:07 np0005604791 nova_compute[226294]: 2026-02-02 10:10:07.404 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:09.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:09.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:11 np0005604791 nova_compute[226294]: 2026-02-02 10:10:11.060 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:11.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:11.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:11 np0005604791 podman[234971]: 2026-02-02 10:10:11.416926501 +0000 UTC m=+0.088977564 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb  2 05:10:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:12 np0005604791 nova_compute[226294]: 2026-02-02 10:10:12.406 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:12 np0005604791 nova_compute[226294]: 2026-02-02 10:10:12.995 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.014 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.015 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.015 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.028 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.029 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.029 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.030 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.030 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:10:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:13.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:13.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:10:13 np0005604791 nova_compute[226294]: 2026-02-02 10:10:13.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:10:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:10:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:15.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:10:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:15.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:15 np0005604791 nova_compute[226294]: 2026-02-02 10:10:15.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:10:16 np0005604791 nova_compute[226294]: 2026-02-02 10:10:16.061 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:16 np0005604791 nova_compute[226294]: 2026-02-02 10:10:16.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:10:16 np0005604791 nova_compute[226294]: 2026-02-02 10:10:16.676 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:10:16 np0005604791 nova_compute[226294]: 2026-02-02 10:10:16.677 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:10:16 np0005604791 nova_compute[226294]: 2026-02-02 10:10:16.677 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:10:16 np0005604791 nova_compute[226294]: 2026-02-02 10:10:16.677 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:10:16 np0005604791 nova_compute[226294]: 2026-02-02 10:10:16.678 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:10:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:10:17 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/975664394' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.123 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.293 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.294 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4888MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.295 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.295 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:10:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:17.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.374 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.374 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:10:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:17.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.394 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:10:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.407 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:10:17 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/293474388' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.848 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.854 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.871 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.874 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:10:17 np0005604791 nova_compute[226294]: 2026-02-02 10:10:17.874 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:10:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:10:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:19.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:10:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:19.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:21 np0005604791 nova_compute[226294]: 2026-02-02 10:10:21.064 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:21.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:21.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:22 np0005604791 nova_compute[226294]: 2026-02-02 10:10:22.445 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:10:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:23.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:10:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:23.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:10:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:25.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:10:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:25.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:26 np0005604791 nova_compute[226294]: 2026-02-02 10:10:26.066 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb  2 05:10:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:27.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb  2 05:10:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:27.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:27 np0005604791 nova_compute[226294]: 2026-02-02 10:10:27.495 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:10:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:29.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:10:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:29.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:31 np0005604791 nova_compute[226294]: 2026-02-02 10:10:31.068 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:31.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:10:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:31.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:10:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:32 np0005604791 nova_compute[226294]: 2026-02-02 10:10:32.528 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:33.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:33.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:34 np0005604791 podman[235070]: 2026-02-02 10:10:34.44619357 +0000 UTC m=+0.120142002 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb  2 05:10:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:35.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:35.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:36 np0005604791 nova_compute[226294]: 2026-02-02 10:10:36.088 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:37.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:37.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:37 np0005604791 nova_compute[226294]: 2026-02-02 10:10:37.531 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:39.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:39.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:41 np0005604791 nova_compute[226294]: 2026-02-02 10:10:41.113 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:41.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:41.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:42 np0005604791 podman[235125]: 2026-02-02 10:10:42.409207376 +0000 UTC m=+0.086198870 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb  2 05:10:42 np0005604791 nova_compute[226294]: 2026-02-02 10:10:42.560 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:43.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:43.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:10:44.909 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:10:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:10:44.910 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:10:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:10:44.910 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:10:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:45.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:45.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:46 np0005604791 nova_compute[226294]: 2026-02-02 10:10:46.150 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:47.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:47.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:47 np0005604791 nova_compute[226294]: 2026-02-02 10:10:47.562 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:10:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:10:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:10:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:10:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:10:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:10:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:49.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:49.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:51 np0005604791 nova_compute[226294]: 2026-02-02 10:10:51.204 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:51.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:51.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:52 np0005604791 nova_compute[226294]: 2026-02-02 10:10:52.591 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:53.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:10:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:53.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:10:54 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:10:54 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:10:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:55.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:10:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:55.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:10:55 np0005604791 nova_compute[226294]: 2026-02-02 10:10:55.520 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:55 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:10:55.521 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:10:55 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:10:55.523 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb  2 05:10:56 np0005604791 nova_compute[226294]: 2026-02-02 10:10:56.229 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:57.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:10:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:57.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:57 np0005604791 nova_compute[226294]: 2026-02-02 10:10:57.625 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:10:58 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:10:58.525 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:10:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:59.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:10:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:10:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:10:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:59.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:01 np0005604791 nova_compute[226294]: 2026-02-02 10:11:01.275 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:11:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:01.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:11:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:01.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:02 np0005604791 nova_compute[226294]: 2026-02-02 10:11:02.627 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.048 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.049 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.068 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.162 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.162 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.171 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.172 226298 INFO nova.compute.claims [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Claim successful on node compute-1.ctlplane.example.com#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.270 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:11:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:03.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:03.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:03 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:11:03 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/20677095' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.757 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.765 226298 DEBUG nova.compute.provider_tree [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.813 226298 DEBUG nova.scheduler.client.report [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.858 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.859 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.925 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.926 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.959 226298 INFO nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Feb  2 05:11:03 np0005604791 nova_compute[226294]: 2026-02-02 10:11:03.978 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.077 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.078 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.079 226298 INFO nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Creating image(s)#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.102 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.126 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.154 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.157 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.211 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.212 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.213 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.213 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.238 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.243 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.340 226298 DEBUG nova.policy [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b1695a2a70d4aa0aa350ba17d8f6d5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.514 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.272s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.578 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] resizing rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.725 226298 DEBUG nova.objects.instance [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'migration_context' on Instance uuid 42dc4712-7770-4ecd-abba-8c8e970f8e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.745 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.745 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Ensure instance console log exists: /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.746 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.747 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:11:04 np0005604791 nova_compute[226294]: 2026-02-02 10:11:04.747 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:11:05 np0005604791 nova_compute[226294]: 2026-02-02 10:11:05.278 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Successfully created port: 29f94a0b-58b9-437a-9157-c3ce95454def _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Feb  2 05:11:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:11:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:05.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:05 np0005604791 podman[235474]: 2026-02-02 10:11:05.431852292 +0000 UTC m=+0.099992626 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:11:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:05.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:06 np0005604791 nova_compute[226294]: 2026-02-02 10:11:05.998 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Successfully updated port: 29f94a0b-58b9-437a-9157-c3ce95454def _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Feb  2 05:11:06 np0005604791 nova_compute[226294]: 2026-02-02 10:11:06.010 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:11:06 np0005604791 nova_compute[226294]: 2026-02-02 10:11:06.010 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:11:06 np0005604791 nova_compute[226294]: 2026-02-02 10:11:06.011 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Feb  2 05:11:06 np0005604791 nova_compute[226294]: 2026-02-02 10:11:06.117 226298 DEBUG nova.compute.manager [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:11:06 np0005604791 nova_compute[226294]: 2026-02-02 10:11:06.117 226298 DEBUG nova.compute.manager [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing instance network info cache due to event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Feb  2 05:11:06 np0005604791 nova_compute[226294]: 2026-02-02 10:11:06.118 226298 DEBUG oslo_concurrency.lockutils [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:11:06 np0005604791 nova_compute[226294]: 2026-02-02 10:11:06.257 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Feb  2 05:11:06 np0005604791 nova_compute[226294]: 2026-02-02 10:11:06.309 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.081 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.108 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.108 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance network_info: |[{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.109 226298 DEBUG oslo_concurrency.lockutils [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.109 226298 DEBUG nova.network.neutron [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.114 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Start _get_guest_xml network_info=[{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-02T10:01:42Z,direct_url=<?>,disk_format='qcow2',id=d5e062d7-95ef-409c-9ad0-60f7cf6f44ce,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='823d3e7e313a44e9a50531e3fef22a1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-02T10:01:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'guest_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': 'd5e062d7-95ef-409c-9ad0-60f7cf6f44ce'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.120 226298 WARNING nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.126 226298 DEBUG nova.virt.libvirt.host [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.127 226298 DEBUG nova.virt.libvirt.host [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.139 226298 DEBUG nova.virt.libvirt.host [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.140 226298 DEBUG nova.virt.libvirt.host [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.141 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.142 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-02T10:01:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1194feb9-e285-414e-825a-1e77171d092f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-02T10:01:42Z,direct_url=<?>,disk_format='qcow2',id=d5e062d7-95ef-409c-9ad0-60f7cf6f44ce,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='823d3e7e313a44e9a50531e3fef22a1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-02T10:01:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.143 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.143 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.144 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.145 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.145 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.146 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.146 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.147 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.148 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.148 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.154 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:11:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:07.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:07.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.441992) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067442024, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1443, "num_deletes": 255, "total_data_size": 3522886, "memory_usage": 3616896, "flush_reason": "Manual Compaction"}
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067462832, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2262069, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28409, "largest_seqno": 29847, "table_properties": {"data_size": 2256038, "index_size": 3230, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 12928, "raw_average_key_size": 19, "raw_value_size": 2243786, "raw_average_value_size": 3358, "num_data_blocks": 143, "num_entries": 668, "num_filter_entries": 668, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026962, "oldest_key_time": 1770026962, "file_creation_time": 1770027067, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 20921 microseconds, and 4325 cpu microseconds.
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.462907) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2262069 bytes OK
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.462936) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.466057) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.466083) EVENT_LOG_v1 {"time_micros": 1770027067466075, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.466127) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3516070, prev total WAL file size 3516070, number of live WAL files 2.
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.466954) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353031' seq:72057594037927935, type:22 .. '6C6F676D00373532' seq:0, type:0; will stop at (end)
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2209KB)], [54(13MB)]
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067466978, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 16260862, "oldest_snapshot_seqno": -1}
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5989 keys, 16112660 bytes, temperature: kUnknown
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067594872, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 16112660, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16070218, "index_size": 26396, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14981, "raw_key_size": 152759, "raw_average_key_size": 25, "raw_value_size": 15959887, "raw_average_value_size": 2664, "num_data_blocks": 1081, "num_entries": 5989, "num_filter_entries": 5989, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027067, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.595229) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 16112660 bytes
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.596876) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 127.1 rd, 125.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 13.4 +0.0 blob) out(15.4 +0.0 blob), read-write-amplify(14.3) write-amplify(7.1) OK, records in: 6517, records dropped: 528 output_compression: NoCompression
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.596912) EVENT_LOG_v1 {"time_micros": 1770027067596892, "job": 32, "event": "compaction_finished", "compaction_time_micros": 127985, "compaction_time_cpu_micros": 21052, "output_level": 6, "num_output_files": 1, "total_output_size": 16112660, "num_input_records": 6517, "num_output_records": 5989, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067597351, "job": 32, "event": "table_file_deletion", "file_number": 56}
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067599030, "job": 32, "event": "table_file_deletion", "file_number": 54}
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.466894) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.599127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.599134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.599139) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.599169) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.599174) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb  2 05:11:07 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2282015443' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.680 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.687 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.711 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:11:07 np0005604791 nova_compute[226294]: 2026-02-02 10:11:07.715 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:11:08 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb  2 05:11:08 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/17390625' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.112 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.115 226298 DEBUG nova.virt.libvirt.vif [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-02T10:11:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1589717047',display_name='tempest-TestNetworkBasicOps-server-1589717047',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1589717047',id=11,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFKLhtXPFNo+7qMy7WM4rXv1wxOn6wk80R7orPjLFemWslU1farAMLdF2l7TazRd92gQv0m2wSsyelv9AIIl5lW/89YdwjsAA40J0bv4RJZ9H+7Em3wwtPI4Gx0836EIRw==',key_name='tempest-TestNetworkBasicOps-746965999',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-n5k0k93c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-02T10:11:04Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=42dc4712-7770-4ecd-abba-8c8e970f8e46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.116 226298 DEBUG nova.network.os_vif_util [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.118 226298 DEBUG nova.network.os_vif_util [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.121 226298 DEBUG nova.objects.instance [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'pci_devices' on Instance uuid 42dc4712-7770-4ecd-abba-8c8e970f8e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.145 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] End _get_guest_xml xml=<domain type="kvm">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <uuid>42dc4712-7770-4ecd-abba-8c8e970f8e46</uuid>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <name>instance-0000000b</name>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <memory>131072</memory>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <vcpu>1</vcpu>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <metadata>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <nova:name>tempest-TestNetworkBasicOps-server-1589717047</nova:name>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <nova:creationTime>2026-02-02 10:11:07</nova:creationTime>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <nova:flavor name="m1.nano">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <nova:memory>128</nova:memory>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <nova:disk>1</nova:disk>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <nova:swap>0</nova:swap>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <nova:ephemeral>0</nova:ephemeral>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <nova:vcpus>1</nova:vcpus>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      </nova:flavor>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <nova:owner>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      </nova:owner>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <nova:ports>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <nova:port uuid="29f94a0b-58b9-437a-9157-c3ce95454def">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        </nova:port>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      </nova:ports>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    </nova:instance>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  </metadata>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <sysinfo type="smbios">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <system>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <entry name="manufacturer">RDO</entry>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <entry name="product">OpenStack Compute</entry>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <entry name="serial">42dc4712-7770-4ecd-abba-8c8e970f8e46</entry>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <entry name="uuid">42dc4712-7770-4ecd-abba-8c8e970f8e46</entry>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <entry name="family">Virtual Machine</entry>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    </system>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  </sysinfo>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <os>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <type arch="x86_64" machine="q35">hvm</type>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <boot dev="hd"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <smbios mode="sysinfo"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  </os>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <features>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <acpi/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <apic/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <vmcoreinfo/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  </features>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <clock offset="utc">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <timer name="pit" tickpolicy="delay"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <timer name="rtc" tickpolicy="catchup"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <timer name="hpet" present="no"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  </clock>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <cpu mode="host-model" match="exact">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <topology sockets="1" cores="1" threads="1"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  </cpu>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  <devices>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <disk type="network" device="disk">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <driver type="raw" cache="none"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <source protocol="rbd" name="vms/42dc4712-7770-4ecd-abba-8c8e970f8e46_disk">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <host name="192.168.122.100" port="6789"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <host name="192.168.122.102" port="6789"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <host name="192.168.122.101" port="6789"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <auth username="openstack">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <secret type="ceph" uuid="d241d473-9fcb-5f74-b163-f1ca4454e7f1"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <target dev="vda" bus="virtio"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <disk type="network" device="cdrom">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <driver type="raw" cache="none"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <source protocol="rbd" name="vms/42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <host name="192.168.122.100" port="6789"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <host name="192.168.122.102" port="6789"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <host name="192.168.122.101" port="6789"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      </source>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <auth username="openstack">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:        <secret type="ceph" uuid="d241d473-9fcb-5f74-b163-f1ca4454e7f1"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      </auth>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <target dev="sda" bus="sata"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    </disk>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <interface type="ethernet">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <mac address="fa:16:3e:5f:0c:ce"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <model type="virtio"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <driver name="vhost" rx_queue_size="512"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <mtu size="1442"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <target dev="tap29f94a0b-58"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    </interface>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <serial type="pty">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <log file="/var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/console.log" append="off"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    </serial>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <video>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <model type="virtio"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    </video>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <input type="tablet" bus="usb"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <rng model="virtio">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <backend model="random">/dev/urandom</backend>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    </rng>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="pci" model="pcie-root-port"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <controller type="usb" index="0"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    <memballoon model="virtio">
Feb  2 05:11:08 np0005604791 nova_compute[226294]:      <stats period="10"/>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:    </memballoon>
Feb  2 05:11:08 np0005604791 nova_compute[226294]:  </devices>
Feb  2 05:11:08 np0005604791 nova_compute[226294]: </domain>
Feb  2 05:11:08 np0005604791 nova_compute[226294]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.146 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Preparing to wait for external event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.147 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.147 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.148 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.149 226298 DEBUG nova.virt.libvirt.vif [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-02T10:11:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1589717047',display_name='tempest-TestNetworkBasicOps-server-1589717047',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1589717047',id=11,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFKLhtXPFNo+7qMy7WM4rXv1wxOn6wk80R7orPjLFemWslU1farAMLdF2l7TazRd92gQv0m2wSsyelv9AIIl5lW/89YdwjsAA40J0bv4RJZ9H+7Em3wwtPI4Gx0836EIRw==',key_name='tempest-TestNetworkBasicOps-746965999',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-n5k0k93c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-02T10:11:04Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=42dc4712-7770-4ecd-abba-8c8e970f8e46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.149 226298 DEBUG nova.network.os_vif_util [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.150 226298 DEBUG nova.network.os_vif_util [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.151 226298 DEBUG os_vif [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.152 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.152 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.153 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.157 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.157 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29f94a0b-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.157 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap29f94a0b-58, col_values=(('external_ids', {'iface-id': '29f94a0b-58b9-437a-9157-c3ce95454def', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:0c:ce', 'vm-uuid': '42dc4712-7770-4ecd-abba-8c8e970f8e46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.158 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:08 np0005604791 NetworkManager[49055]: <info>  [1770027068.1595] manager: (tap29f94a0b-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.162 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.164 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.164 226298 INFO os_vif [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58')#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.225 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.225 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.226 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No VIF found with MAC fa:16:3e:5f:0c:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.227 226298 INFO nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Using config drive#033[00m
Feb  2 05:11:08 np0005604791 nova_compute[226294]: 2026-02-02 10:11:08.262 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.004 226298 DEBUG nova.network.neutron [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated VIF entry in instance network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.005 226298 DEBUG nova.network.neutron [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.024 226298 DEBUG oslo_concurrency.lockutils [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.152 226298 INFO nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Creating config drive at /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.160 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpyunjrs_n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.286 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpyunjrs_n" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.325 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.329 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:11:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:09.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:09.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.513 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.184s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.514 226298 INFO nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Deleting local config drive /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config because it was imported into RBD.#033[00m
Feb  2 05:11:09 np0005604791 systemd[1]: Starting libvirt secret daemon...
Feb  2 05:11:09 np0005604791 systemd[1]: Started libvirt secret daemon.
Feb  2 05:11:09 np0005604791 kernel: tap29f94a0b-58: entered promiscuous mode
Feb  2 05:11:09 np0005604791 NetworkManager[49055]: <info>  [1770027069.6166] manager: (tap29f94a0b-58): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Feb  2 05:11:09 np0005604791 ovn_controller[133666]: 2026-02-02T10:11:09Z|00058|binding|INFO|Claiming lport 29f94a0b-58b9-437a-9157-c3ce95454def for this chassis.
Feb  2 05:11:09 np0005604791 ovn_controller[133666]: 2026-02-02T10:11:09Z|00059|binding|INFO|29f94a0b-58b9-437a-9157-c3ce95454def: Claiming fa:16:3e:5f:0c:ce 10.100.0.6
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.617 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.621 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.627 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.632 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.641 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:0c:ce 10.100.0.6'], port_security=['fa:16:3e:5f:0c:ce 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '42dc4712-7770-4ecd-abba-8c8e970f8e46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a8371570-b364-43fb-9d49-41b819ae5fa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7385ccf6-5875-4ca6-bbfb-418e49c25618, chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=29f94a0b-58b9-437a-9157-c3ce95454def) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.643 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 29f94a0b-58b9-437a-9157-c3ce95454def in datapath 07b5f9e6-a53d-47d1-be8b-5269063b871d bound to our chassis#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.644 143542 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 07b5f9e6-a53d-47d1-be8b-5269063b871d#033[00m
Feb  2 05:11:09 np0005604791 systemd-machined[195072]: New machine qemu-3-instance-0000000b.
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.656 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[16437a34-54db-4d11-97b6-6bd19be6c91e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.657 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap07b5f9e6-a1 in ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.659 229827 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap07b5f9e6-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.659 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[2b74dae0-39f8-4d4d-89e6-401173eed0e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.660 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[1183bc3c-ad31-4aa3-9ea4-f98d79e2a7a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.672 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[d75f4cf4-9185-47de-9757-ea2bbeabe486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_controller[133666]: 2026-02-02T10:11:09Z|00060|binding|INFO|Setting lport 29f94a0b-58b9-437a-9157-c3ce95454def ovn-installed in OVS
Feb  2 05:11:09 np0005604791 ovn_controller[133666]: 2026-02-02T10:11:09Z|00061|binding|INFO|Setting lport 29f94a0b-58b9-437a-9157-c3ce95454def up in Southbound
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.677 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:09 np0005604791 systemd[1]: Started Virtual Machine qemu-3-instance-0000000b.
Feb  2 05:11:09 np0005604791 systemd-udevd[235660]: Network interface NamePolicy= disabled on kernel command line.
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.700 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b37b91-c3e0-4330-81d6-1e97d34de4bf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 NetworkManager[49055]: <info>  [1770027069.7061] device (tap29f94a0b-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb  2 05:11:09 np0005604791 NetworkManager[49055]: <info>  [1770027069.7074] device (tap29f94a0b-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.726 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[6bb1c889-7952-4fe5-9370-4e9ad00bc5df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 systemd-udevd[235662]: Network interface NamePolicy= disabled on kernel command line.
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.732 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[e4fa83cc-5fd1-4c61-b70b-25c166f31665]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 NetworkManager[49055]: <info>  [1770027069.7333] manager: (tap07b5f9e6-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.764 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[3db2d985-7ac7-4314-8aae-f5057c335b61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.767 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5f9d47-83c3-41da-bf2d-a1d566e55c14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 NetworkManager[49055]: <info>  [1770027069.7849] device (tap07b5f9e6-a0): carrier: link connected
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.790 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[0b9e79d4-4827-477d-8939-6bf1019e070d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.804 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[f63e72ce-bb45-4977-97d5-24a4ee317936]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap07b5f9e6-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:18:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423209, 'reachable_time': 37330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235690, 'error': None, 'target': 'ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.817 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[511e9d43-4f90-4c02-a1c4-c03a73e0355a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe41:18c1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423209, 'tstamp': 423209}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235691, 'error': None, 'target': 'ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.834 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed1b81d-fc08-4895-bdbb-5d8cb59f4a0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap07b5f9e6-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:18:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423209, 'reachable_time': 37330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235692, 'error': None, 'target': 'ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.861 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a71186-027e-4c6e-a226-737d0c790707]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.908 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[80866ac5-5fe7-45db-8a8f-95475bfa623e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.910 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07b5f9e6-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.910 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.911 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap07b5f9e6-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.923 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:09 np0005604791 kernel: tap07b5f9e6-a0: entered promiscuous mode
Feb  2 05:11:09 np0005604791 NetworkManager[49055]: <info>  [1770027069.9242] manager: (tap07b5f9e6-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.927 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.928 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap07b5f9e6-a0, col_values=(('external_ids', {'iface-id': '1cafc178-edb3-4734-825c-ef4e45193789'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.930 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:09 np0005604791 ovn_controller[133666]: 2026-02-02T10:11:09Z|00062|binding|INFO|Releasing lport 1cafc178-edb3-4734-825c-ef4e45193789 from this chassis (sb_readonly=0)
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.931 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.931 143542 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/07b5f9e6-a53d-47d1-be8b-5269063b871d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/07b5f9e6-a53d-47d1-be8b-5269063b871d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Feb  2 05:11:09 np0005604791 nova_compute[226294]: 2026-02-02 10:11:09.935 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.935 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[800e92d3-2c99-4c14-a7b6-0a5f81645cb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.937 143542 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: global
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    log         /dev/log local0 debug
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    log-tag     haproxy-metadata-proxy-07b5f9e6-a53d-47d1-be8b-5269063b871d
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    user        root
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    group       root
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    maxconn     1024
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    pidfile     /var/lib/neutron/external/pids/07b5f9e6-a53d-47d1-be8b-5269063b871d.pid.haproxy
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    daemon
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: defaults
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    log global
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    mode http
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    option httplog
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    option dontlognull
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    option http-server-close
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    option forwardfor
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    retries                 3
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    timeout http-request    30s
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    timeout connect         30s
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    timeout client          32s
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    timeout server          32s
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    timeout http-keep-alive 30s
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: listen listener
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    bind 169.254.169.254:80
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    server metadata /var/lib/neutron/metadata_proxy
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]:    http-request add-header X-OVN-Network-ID 07b5f9e6-a53d-47d1-be8b-5269063b871d
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Feb  2 05:11:09 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.937 143542 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'env', 'PROCESS_TAG=haproxy-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/07b5f9e6-a53d-47d1-be8b-5269063b871d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.104 226298 DEBUG nova.compute.manager [req-3fdcb1c4-813c-4127-8e3b-6f420da4db69 req-a58829e8-2cdf-4d4c-967f-f35026035452 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.115 226298 DEBUG oslo_concurrency.lockutils [req-3fdcb1c4-813c-4127-8e3b-6f420da4db69 req-a58829e8-2cdf-4d4c-967f-f35026035452 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.116 226298 DEBUG oslo_concurrency.lockutils [req-3fdcb1c4-813c-4127-8e3b-6f420da4db69 req-a58829e8-2cdf-4d4c-967f-f35026035452 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.117 226298 DEBUG oslo_concurrency.lockutils [req-3fdcb1c4-813c-4127-8e3b-6f420da4db69 req-a58829e8-2cdf-4d4c-967f-f35026035452 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.117 226298 DEBUG nova.compute.manager [req-3fdcb1c4-813c-4127-8e3b-6f420da4db69 req-a58829e8-2cdf-4d4c-967f-f35026035452 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Processing event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.262 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.263 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770027070.262436, 42dc4712-7770-4ecd-abba-8c8e970f8e46 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.264 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] VM Started (Lifecycle Event)#033[00m
Feb  2 05:11:10 np0005604791 podman[235764]: 2026-02-02 10:11:10.26525355 +0000 UTC m=+0.050113932 container create 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.266 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.271 226298 INFO nova.virt.libvirt.driver [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance spawned successfully.#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.272 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.284 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.291 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.295 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.296 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.296 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.297 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.297 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.298 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Feb  2 05:11:10 np0005604791 systemd[1]: Started libpod-conmon-4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7.scope.
Feb  2 05:11:10 np0005604791 systemd[1]: Started libcrun container.
Feb  2 05:11:10 np0005604791 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26cfac83bec4483eeb9fd6487ef88b8a7ccc7477473882fbfbe7498fdcc8d7a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb  2 05:11:10 np0005604791 podman[235764]: 2026-02-02 10:11:10.236183378 +0000 UTC m=+0.021043790 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.334 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.334 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770027070.2634013, 42dc4712-7770-4ecd-abba-8c8e970f8e46 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.335 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] VM Paused (Lifecycle Event)#033[00m
Feb  2 05:11:10 np0005604791 podman[235764]: 2026-02-02 10:11:10.339988285 +0000 UTC m=+0.124848687 container init 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:11:10 np0005604791 podman[235764]: 2026-02-02 10:11:10.344400752 +0000 UTC m=+0.129261124 container start 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.360 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb  2 05:11:10 np0005604791 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [NOTICE]   (235785) : New worker (235787) forked
Feb  2 05:11:10 np0005604791 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [NOTICE]   (235785) : Loading success.
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.370 226298 INFO nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Took 6.29 seconds to spawn the instance on the hypervisor.#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.371 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.372 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770027070.266223, 42dc4712-7770-4ecd-abba-8c8e970f8e46 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.372 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] VM Resumed (Lifecycle Event)
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.391 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.394 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.418 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] During sync_power_state the instance has a pending task (spawning). Skip.
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.441 226298 INFO nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Took 7.32 seconds to build instance.
Feb  2 05:11:10 np0005604791 nova_compute[226294]: 2026-02-02 10:11:10.460 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:11:11 np0005604791 nova_compute[226294]: 2026-02-02 10:11:11.340 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:11.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:11:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:11.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:11:12 np0005604791 nova_compute[226294]: 2026-02-02 10:11:12.193 226298 DEBUG nova.compute.manager [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb  2 05:11:12 np0005604791 nova_compute[226294]: 2026-02-02 10:11:12.194 226298 DEBUG oslo_concurrency.lockutils [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:11:12 np0005604791 nova_compute[226294]: 2026-02-02 10:11:12.194 226298 DEBUG oslo_concurrency.lockutils [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:11:12 np0005604791 nova_compute[226294]: 2026-02-02 10:11:12.195 226298 DEBUG oslo_concurrency.lockutils [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:11:12 np0005604791 nova_compute[226294]: 2026-02-02 10:11:12.195 226298 DEBUG nova.compute.manager [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb  2 05:11:12 np0005604791 nova_compute[226294]: 2026-02-02 10:11:12.195 226298 WARNING nova.compute.manager [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state active and task_state None.
Feb  2 05:11:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:13 np0005604791 nova_compute[226294]: 2026-02-02 10:11:13.159 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:13 np0005604791 podman[235798]: 2026-02-02 10:11:13.366269538 +0000 UTC m=+0.040938028 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 05:11:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:13.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:13.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.456 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:14 np0005604791 NetworkManager[49055]: <info>  [1770027074.4575] manager: (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Feb  2 05:11:14 np0005604791 NetworkManager[49055]: <info>  [1770027074.4587] manager: (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Feb  2 05:11:14 np0005604791 ovn_controller[133666]: 2026-02-02T10:11:14Z|00063|binding|INFO|Releasing lport 1cafc178-edb3-4734-825c-ef4e45193789 from this chassis (sb_readonly=0)
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.474 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:14 np0005604791 ovn_controller[133666]: 2026-02-02T10:11:14Z|00064|binding|INFO|Releasing lport 1cafc178-edb3-4734-825c-ef4e45193789 from this chassis (sb_readonly=0)
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.482 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.691 226298 DEBUG nova.compute.manager [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.692 226298 DEBUG nova.compute.manager [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing instance network info cache due to event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.692 226298 DEBUG oslo_concurrency.lockutils [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.692 226298 DEBUG oslo_concurrency.lockutils [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.693 226298 DEBUG nova.network.neutron [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.875 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.875 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb  2 05:11:14 np0005604791 nova_compute[226294]: 2026-02-02 10:11:14.875 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb  2 05:11:15 np0005604791 nova_compute[226294]: 2026-02-02 10:11:15.056 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb  2 05:11:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:15.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:15.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:15 np0005604791 nova_compute[226294]: 2026-02-02 10:11:15.843 226298 DEBUG nova.network.neutron [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated VIF entry in instance network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb  2 05:11:15 np0005604791 nova_compute[226294]: 2026-02-02 10:11:15.843 226298 DEBUG nova.network.neutron [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb  2 05:11:15 np0005604791 nova_compute[226294]: 2026-02-02 10:11:15.920 226298 DEBUG oslo_concurrency.lockutils [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb  2 05:11:15 np0005604791 nova_compute[226294]: 2026-02-02 10:11:15.921 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb  2 05:11:15 np0005604791 nova_compute[226294]: 2026-02-02 10:11:15.921 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb  2 05:11:15 np0005604791 nova_compute[226294]: 2026-02-02 10:11:15.922 226298 DEBUG nova.objects.instance [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 42dc4712-7770-4ecd-abba-8c8e970f8e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb  2 05:11:16 np0005604791 nova_compute[226294]: 2026-02-02 10:11:16.343 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:11:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:17.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:11:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:17.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.644 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.687 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.688 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.688 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.688 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.688 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.689 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.689 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.689 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.689 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.689 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.792 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.792 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.792 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb  2 05:11:17 np0005604791 nova_compute[226294]: 2026-02-02 10:11:17.792 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.162 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:18 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:11:18 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3049014938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.305 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.439 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.439 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.651 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.653 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4771MB free_disk=59.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.653 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.653 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.813 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Instance 42dc4712-7770-4ecd-abba-8c8e970f8e46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.813 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.813 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb  2 05:11:18 np0005604791 nova_compute[226294]: 2026-02-02 10:11:18.854 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb  2 05:11:19 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:11:19 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3378367145' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:11:19 np0005604791 nova_compute[226294]: 2026-02-02 10:11:19.278 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb  2 05:11:19 np0005604791 nova_compute[226294]: 2026-02-02 10:11:19.282 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb  2 05:11:19 np0005604791 nova_compute[226294]: 2026-02-02 10:11:19.299 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb  2 05:11:19 np0005604791 nova_compute[226294]: 2026-02-02 10:11:19.321 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb  2 05:11:19 np0005604791 nova_compute[226294]: 2026-02-02 10:11:19.322 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:11:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:19.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:11:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:19.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:20 np0005604791 nova_compute[226294]: 2026-02-02 10:11:20.092 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:11:21 np0005604791 nova_compute[226294]: 2026-02-02 10:11:21.345 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:21.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:21.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:22 np0005604791 ovn_controller[133666]: 2026-02-02T10:11:22Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:0c:ce 10.100.0.6
Feb  2 05:11:22 np0005604791 ovn_controller[133666]: 2026-02-02T10:11:22Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:0c:ce 10.100.0.6
Feb  2 05:11:23 np0005604791 nova_compute[226294]: 2026-02-02 10:11:23.164 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:23.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:23.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:25.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:25.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:26 np0005604791 nova_compute[226294]: 2026-02-02 10:11:26.348 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:27.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:27.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:28 np0005604791 nova_compute[226294]: 2026-02-02 10:11:28.166 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb  2 05:11:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:29.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb  2 05:11:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:29.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:31 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/101131 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 05:11:31 np0005604791 nova_compute[226294]: 2026-02-02 10:11:31.350 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:11:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:31.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:11:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:31.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:33 np0005604791 nova_compute[226294]: 2026-02-02 10:11:33.167 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:11:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:33.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:33.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:11:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:35.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:11:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:35.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:11:36 np0005604791 nova_compute[226294]: 2026-02-02 10:11:36.353 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:36 np0005604791 podman[235900]: 2026-02-02 10:11:36.399667304 +0000 UTC m=+0.073938375 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb  2 05:11:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:11:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:37.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:11:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:37.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:38 np0005604791 nova_compute[226294]: 2026-02-02 10:11:38.169 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:11:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:39.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:39.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:41 np0005604791 nova_compute[226294]: 2026-02-02 10:11:41.354 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:41.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:41.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:43 np0005604791 nova_compute[226294]: 2026-02-02 10:11:43.171 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:43.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:43.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:43 np0005604791 podman[235956]: 2026-02-02 10:11:43.632882526 +0000 UTC m=+0.043942718 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb  2 05:11:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:44.910 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:11:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:44.910 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:11:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:44.911 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:11:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:11:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:45.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:11:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:45.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:46 np0005604791 nova_compute[226294]: 2026-02-02 10:11:46.356 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:47.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:47.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:48 np0005604791 nova_compute[226294]: 2026-02-02 10:11:48.217 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:49.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:49.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/101150 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb  2 05:11:50 np0005604791 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [ALERT] 032/101150 (4) : backend 'backend' has no server available!
Feb  2 05:11:51 np0005604791 nova_compute[226294]: 2026-02-02 10:11:51.357 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:51.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:51.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:53 np0005604791 nova_compute[226294]: 2026-02-02 10:11:53.219 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:53.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:53.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:54 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:11:55 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:11:55 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:11:55 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:11:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:11:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:55.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:11:55 np0005604791 nova_compute[226294]: 2026-02-02 10:11:55.479 226298 INFO nova.compute.manager [None req-55322302-f8bd-44d7-bf6e-df54484870c9 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Get console output
Feb  2 05:11:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:55.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:55 np0005604791 nova_compute[226294]: 2026-02-02 10:11:55.486 232427 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb  2 05:11:56 np0005604791 nova_compute[226294]: 2026-02-02 10:11:56.359 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:11:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:11:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:57.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.479 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:11:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:57.480 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb  2 05:11:57 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:11:57.481 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb  2 05:11:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:57.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.781 226298 DEBUG nova.compute.manager [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.781 226298 DEBUG nova.compute.manager [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing instance network info cache due to event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.782 226298 DEBUG oslo_concurrency.lockutils [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.782 226298 DEBUG oslo_concurrency.lockutils [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.783 226298 DEBUG nova.network.neutron [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.848 226298 DEBUG nova.compute.manager [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.849 226298 DEBUG oslo_concurrency.lockutils [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.849 226298 DEBUG oslo_concurrency.lockutils [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.849 226298 DEBUG oslo_concurrency.lockutils [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.849 226298 DEBUG nova.compute.manager [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:11:57 np0005604791 nova_compute[226294]: 2026-02-02 10:11:57.849 226298 WARNING nova.compute.manager [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state active and task_state None.#033[00m
Feb  2 05:11:58 np0005604791 nova_compute[226294]: 2026-02-02 10:11:58.221 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:11:58 np0005604791 nova_compute[226294]: 2026-02-02 10:11:58.673 226298 INFO nova.compute.manager [None req-e0c78638-0642-4dfe-93c7-9276d0f7149b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Get console output#033[00m
Feb  2 05:11:58 np0005604791 nova_compute[226294]: 2026-02-02 10:11:58.679 232427 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Feb  2 05:11:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:59.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:11:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:11:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:59.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:11:59 np0005604791 nova_compute[226294]: 2026-02-02 10:11:59.607 226298 DEBUG nova.network.neutron [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated VIF entry in instance network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Feb  2 05:11:59 np0005604791 nova_compute[226294]: 2026-02-02 10:11:59.607 226298 DEBUG nova.network.neutron [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:11:59 np0005604791 nova_compute[226294]: 2026-02-02 10:11:59.657 226298 DEBUG oslo_concurrency.lockutils [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.003 226298 DEBUG nova.compute.manager [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.004 226298 DEBUG oslo_concurrency.lockutils [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.004 226298 DEBUG oslo_concurrency.lockutils [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.004 226298 DEBUG oslo_concurrency.lockutils [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.005 226298 DEBUG nova.compute.manager [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.005 226298 WARNING nova.compute.manager [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state active and task_state None.#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.556 226298 DEBUG nova.compute.manager [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.557 226298 DEBUG nova.compute.manager [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing instance network info cache due to event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.557 226298 DEBUG oslo_concurrency.lockutils [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.557 226298 DEBUG oslo_concurrency.lockutils [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.558 226298 DEBUG nova.network.neutron [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.845 226298 INFO nova.compute.manager [None req-d72af9e5-dfb3-47c2-97db-8b229ea6a624 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Get console output#033[00m
Feb  2 05:12:00 np0005604791 nova_compute[226294]: 2026-02-02 10:12:00.850 232427 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Feb  2 05:12:01 np0005604791 nova_compute[226294]: 2026-02-02 10:12:01.401 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:12:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:01.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:01.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:01 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:12:01 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.098 226298 DEBUG nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.098 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.099 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.099 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.099 226298 DEBUG nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.099 226298 WARNING nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state active and task_state None.#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.100 226298 DEBUG nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.100 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.100 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.100 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.101 226298 DEBUG nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.101 226298 WARNING nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state active and task_state None.#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.403 226298 DEBUG nova.network.neutron [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated VIF entry in instance network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.403 226298 DEBUG nova.network.neutron [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb  2 05:12:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:02 np0005604791 nova_compute[226294]: 2026-02-02 10:12:02.427 226298 DEBUG oslo_concurrency.lockutils [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb  2 05:12:03 np0005604791 nova_compute[226294]: 2026-02-02 10:12:03.223 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:12:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:03.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:03 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:03.483 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:12:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:12:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:03.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:12:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:05.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:12:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:05.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.227 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.228 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.229 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.229 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.230 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.232 226298 INFO nova.compute.manager [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Terminating instance#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.234 226298 DEBUG nova.compute.manager [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Feb  2 05:12:06 np0005604791 kernel: tap29f94a0b-58 (unregistering): left promiscuous mode
Feb  2 05:12:06 np0005604791 NetworkManager[49055]: <info>  [1770027126.3116] device (tap29f94a0b-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb  2 05:12:06 np0005604791 ovn_controller[133666]: 2026-02-02T10:12:06Z|00065|binding|INFO|Releasing lport 29f94a0b-58b9-437a-9157-c3ce95454def from this chassis (sb_readonly=0)
Feb  2 05:12:06 np0005604791 ovn_controller[133666]: 2026-02-02T10:12:06Z|00066|binding|INFO|Setting lport 29f94a0b-58b9-437a-9157-c3ce95454def down in Southbound
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.330 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:12:06 np0005604791 ovn_controller[133666]: 2026-02-02T10:12:06Z|00067|binding|INFO|Removing iface tap29f94a0b-58 ovn-installed in OVS
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.341 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.351 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:0c:ce 10.100.0.6'], port_security=['fa:16:3e:5f:0c:ce 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '42dc4712-7770-4ecd-abba-8c8e970f8e46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a8371570-b364-43fb-9d49-41b819ae5fa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7385ccf6-5875-4ca6-bbfb-418e49c25618, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=29f94a0b-58b9-437a-9157-c3ce95454def) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.354 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 29f94a0b-58b9-437a-9157-c3ce95454def in datapath 07b5f9e6-a53d-47d1-be8b-5269063b871d unbound from our chassis#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.356 143542 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 07b5f9e6-a53d-47d1-be8b-5269063b871d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.358 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[3a7dc1d2-20ec-4862-a957-780e73dd6586]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.359 143542 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d namespace which is not needed anymore#033[00m
Feb  2 05:12:06 np0005604791 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Feb  2 05:12:06 np0005604791 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Consumed 14.328s CPU time.
Feb  2 05:12:06 np0005604791 systemd-machined[195072]: Machine qemu-3-instance-0000000b terminated.
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.399 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.467 226298 INFO nova.virt.libvirt.driver [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance destroyed successfully.#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.467 226298 DEBUG nova.objects.instance [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'resources' on Instance uuid 42dc4712-7770-4ecd-abba-8c8e970f8e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb  2 05:12:06 np0005604791 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [NOTICE]   (235785) : haproxy version is 2.8.14-c23fe91
Feb  2 05:12:06 np0005604791 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [NOTICE]   (235785) : path to executable is /usr/sbin/haproxy
Feb  2 05:12:06 np0005604791 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [WARNING]  (235785) : Exiting Master process...
Feb  2 05:12:06 np0005604791 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [ALERT]    (235785) : Current worker (235787) exited with code 143 (Terminated)
Feb  2 05:12:06 np0005604791 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [WARNING]  (235785) : All workers exited. Exiting... (0)
Feb  2 05:12:06 np0005604791 systemd[1]: libpod-4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7.scope: Deactivated successfully.
Feb  2 05:12:06 np0005604791 podman[236144]: 2026-02-02 10:12:06.514401991 +0000 UTC m=+0.054444277 container died 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb  2 05:12:06 np0005604791 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7-userdata-shm.mount: Deactivated successfully.
Feb  2 05:12:06 np0005604791 systemd[1]: var-lib-containers-storage-overlay-26cfac83bec4483eeb9fd6487ef88b8a7ccc7477473882fbfbe7498fdcc8d7a6-merged.mount: Deactivated successfully.
Feb  2 05:12:06 np0005604791 podman[236144]: 2026-02-02 10:12:06.662320649 +0000 UTC m=+0.202362925 container cleanup 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb  2 05:12:06 np0005604791 systemd[1]: libpod-conmon-4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7.scope: Deactivated successfully.
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.698 226298 DEBUG nova.virt.libvirt.vif [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:11:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1589717047',display_name='tempest-TestNetworkBasicOps-server-1589717047',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1589717047',id=11,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFKLhtXPFNo+7qMy7WM4rXv1wxOn6wk80R7orPjLFemWslU1farAMLdF2l7TazRd92gQv0m2wSsyelv9AIIl5lW/89YdwjsAA40J0bv4RJZ9H+7Em3wwtPI4Gx0836EIRw==',key_name='tempest-TestNetworkBasicOps-746965999',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:11:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-n5k0k93c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:11:10Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=42dc4712-7770-4ecd-abba-8c8e970f8e46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.699 226298 DEBUG nova.network.os_vif_util [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.700 226298 DEBUG nova.network.os_vif_util [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.701 226298 DEBUG os_vif [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Feb  2 05:12:06 np0005604791 podman[236166]: 2026-02-02 10:12:06.703112612 +0000 UTC m=+0.194903267 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.704 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.704 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29f94a0b-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.708 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.711 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.716 226298 INFO os_vif [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58')#033[00m
Feb  2 05:12:06 np0005604791 podman[236205]: 2026-02-02 10:12:06.745417136 +0000 UTC m=+0.058800783 container remove 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.751 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[b1fc8833-f5ba-4c5f-9025-bc89b9e133d6]: (4, ('Mon Feb  2 10:12:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d (4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7)\n4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7\nMon Feb  2 10:12:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d (4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7)\n4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.754 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[039239ce-de94-4503-8815-1003ad9183b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.755 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07b5f9e6-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.757 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:12:06 np0005604791 kernel: tap07b5f9e6-a0: left promiscuous mode
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.767 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.770 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5350d28a-f5c5-4613-8adf-ec18c748967c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.788 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[bd53ca92-338e-4650-ac7e-5a80c05a9eb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.790 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[29894cd1-9c6e-48f1-9839-a5d9a663313f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.806 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[db61f0a2-6f45-4e82-85d5-faa37f20be3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423203, 'reachable_time': 32173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236244, 'error': None, 'target': 'ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.809 143813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Feb  2 05:12:06 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.809 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[3ffa7964-ddc5-4f95-8552-553e1e956488]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb  2 05:12:06 np0005604791 systemd[1]: run-netns-ovnmeta\x2d07b5f9e6\x2da53d\x2d47d1\x2dbe8b\x2d5269063b871d.mount: Deactivated successfully.
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.873 226298 DEBUG nova.compute.manager [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.873 226298 DEBUG nova.compute.manager [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing instance network info cache due to event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.874 226298 DEBUG oslo_concurrency.lockutils [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.874 226298 DEBUG oslo_concurrency.lockutils [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb  2 05:12:06 np0005604791 nova_compute[226294]: 2026-02-02 10:12:06.874 226298 DEBUG nova.network.neutron [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Feb  2 05:12:07 np0005604791 nova_compute[226294]: 2026-02-02 10:12:07.176 226298 INFO nova.virt.libvirt.driver [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Deleting instance files /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46_del#033[00m
Feb  2 05:12:07 np0005604791 nova_compute[226294]: 2026-02-02 10:12:07.177 226298 INFO nova.virt.libvirt.driver [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Deletion of /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46_del complete#033[00m
Feb  2 05:12:07 np0005604791 nova_compute[226294]: 2026-02-02 10:12:07.229 226298 INFO nova.compute.manager [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Took 0.99 seconds to destroy the instance on the hypervisor.#033[00m
Feb  2 05:12:07 np0005604791 nova_compute[226294]: 2026-02-02 10:12:07.230 226298 DEBUG oslo.service.loopingcall [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Feb  2 05:12:07 np0005604791 nova_compute[226294]: 2026-02-02 10:12:07.230 226298 DEBUG nova.compute.manager [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Feb  2 05:12:07 np0005604791 nova_compute[226294]: 2026-02-02 10:12:07.231 226298 DEBUG nova.network.neutron [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Feb  2 05:12:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:07.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:07.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:08 np0005604791 nova_compute[226294]: 2026-02-02 10:12:08.200 226298 DEBUG nova.compute.manager [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb  2 05:12:08 np0005604791 nova_compute[226294]: 2026-02-02 10:12:08.200 226298 DEBUG oslo_concurrency.lockutils [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:12:08 np0005604791 nova_compute[226294]: 2026-02-02 10:12:08.201 226298 DEBUG oslo_concurrency.lockutils [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:12:08 np0005604791 nova_compute[226294]: 2026-02-02 10:12:08.201 226298 DEBUG oslo_concurrency.lockutils [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:12:08 np0005604791 nova_compute[226294]: 2026-02-02 10:12:08.202 226298 DEBUG nova.compute.manager [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb  2 05:12:08 np0005604791 nova_compute[226294]: 2026-02-02 10:12:08.202 226298 DEBUG nova.compute.manager [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb  2 05:12:08 np0005604791 nova_compute[226294]: 2026-02-02 10:12:08.939 226298 DEBUG nova.network.neutron [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb  2 05:12:08 np0005604791 nova_compute[226294]: 2026-02-02 10:12:08.973 226298 INFO nova.compute.manager [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Took 1.74 seconds to deallocate network for instance.
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.090 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.090 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.150 226298 DEBUG oslo_concurrency.processutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.192 226298 DEBUG nova.network.neutron [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated VIF entry in instance network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.193 226298 DEBUG nova.network.neutron [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.214 226298 DEBUG oslo_concurrency.lockutils [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb  2 05:12:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:09.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:09.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:09 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:12:09 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3375339960' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.600 226298 DEBUG oslo_concurrency.processutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.607 226298 DEBUG nova.compute.provider_tree [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.626 226298 DEBUG nova.scheduler.client.report [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.657 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.703 226298 INFO nova.scheduler.client.report [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Deleted allocations for instance 42dc4712-7770-4ecd-abba-8c8e970f8e46
Feb  2 05:12:09 np0005604791 nova_compute[226294]: 2026-02-02 10:12:09.804 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:12:10 np0005604791 nova_compute[226294]: 2026-02-02 10:12:10.341 226298 DEBUG nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb  2 05:12:10 np0005604791 nova_compute[226294]: 2026-02-02 10:12:10.341 226298 DEBUG oslo_concurrency.lockutils [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:12:10 np0005604791 nova_compute[226294]: 2026-02-02 10:12:10.342 226298 DEBUG oslo_concurrency.lockutils [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:12:10 np0005604791 nova_compute[226294]: 2026-02-02 10:12:10.342 226298 DEBUG oslo_concurrency.lockutils [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:12:10 np0005604791 nova_compute[226294]: 2026-02-02 10:12:10.342 226298 DEBUG nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb  2 05:12:10 np0005604791 nova_compute[226294]: 2026-02-02 10:12:10.342 226298 WARNING nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state deleted and task_state None.
Feb  2 05:12:10 np0005604791 nova_compute[226294]: 2026-02-02 10:12:10.342 226298 DEBUG nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-deleted-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb  2 05:12:10 np0005604791 nova_compute[226294]: 2026-02-02 10:12:10.343 226298 INFO nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Neutron deleted interface 29f94a0b-58b9-437a-9157-c3ce95454def; detaching it from the instance and deleting it from the info cache
Feb  2 05:12:10 np0005604791 nova_compute[226294]: 2026-02-02 10:12:10.343 226298 DEBUG nova.network.neutron [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Feb  2 05:12:10 np0005604791 nova_compute[226294]: 2026-02-02 10:12:10.346 226298 DEBUG nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Detach interface failed, port_id=29f94a0b-58b9-437a-9157-c3ce95454def, reason: Instance 42dc4712-7770-4ecd-abba-8c8e970f8e46 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Feb  2 05:12:11 np0005604791 nova_compute[226294]: 2026-02-02 10:12:11.400 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:11.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:11.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:11 np0005604791 nova_compute[226294]: 2026-02-02 10:12:11.707 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:13.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:13.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:13 np0005604791 nova_compute[226294]: 2026-02-02 10:12:13.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:12:14 np0005604791 podman[236272]: 2026-02-02 10:12:14.391241507 +0000 UTC m=+0.066274311 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb  2 05:12:14 np0005604791 nova_compute[226294]: 2026-02-02 10:12:14.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:12:14 np0005604791 nova_compute[226294]: 2026-02-02 10:12:14.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb  2 05:12:14 np0005604791 nova_compute[226294]: 2026-02-02 10:12:14.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb  2 05:12:14 np0005604791 nova_compute[226294]: 2026-02-02 10:12:14.662 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb  2 05:12:14 np0005604791 nova_compute[226294]: 2026-02-02 10:12:14.662 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:12:14 np0005604791 nova_compute[226294]: 2026-02-02 10:12:14.662 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:12:14 np0005604791 nova_compute[226294]: 2026-02-02 10:12:14.662 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:12:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:12:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:15.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:12:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:15.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:15 np0005604791 nova_compute[226294]: 2026-02-02 10:12:15.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:12:16 np0005604791 nova_compute[226294]: 2026-02-02 10:12:16.403 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:16 np0005604791 nova_compute[226294]: 2026-02-02 10:12:16.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:12:16 np0005604791 nova_compute[226294]: 2026-02-02 10:12:16.709 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:16 np0005604791 nova_compute[226294]: 2026-02-02 10:12:16.710 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:16 np0005604791 nova_compute[226294]: 2026-02-02 10:12:16.755 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:17.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:17.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:17 np0005604791 nova_compute[226294]: 2026-02-02 10:12:17.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:12:17 np0005604791 nova_compute[226294]: 2026-02-02 10:12:17.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb  2 05:12:17 np0005604791 nova_compute[226294]: 2026-02-02 10:12:17.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:12:17 np0005604791 nova_compute[226294]: 2026-02-02 10:12:17.717 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:12:17 np0005604791 nova_compute[226294]: 2026-02-02 10:12:17.718 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:12:17 np0005604791 nova_compute[226294]: 2026-02-02 10:12:17.718 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:12:17 np0005604791 nova_compute[226294]: 2026-02-02 10:12:17.718 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb  2 05:12:17 np0005604791 nova_compute[226294]: 2026-02-02 10:12:17.719 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb  2 05:12:18 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:12:18 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1372985538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.213 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.388 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.389 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4911MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.390 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.390 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.471 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.472 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.507 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb  2 05:12:18 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:12:18 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4098705518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.971 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.977 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb  2 05:12:18 np0005604791 nova_compute[226294]: 2026-02-02 10:12:18.992 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb  2 05:12:19 np0005604791 nova_compute[226294]: 2026-02-02 10:12:19.021 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb  2 05:12:19 np0005604791 nova_compute[226294]: 2026-02-02 10:12:19.021 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:12:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:19.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:19.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:21 np0005604791 nova_compute[226294]: 2026-02-02 10:12:21.023 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:12:21 np0005604791 nova_compute[226294]: 2026-02-02 10:12:21.404 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:21 np0005604791 nova_compute[226294]: 2026-02-02 10:12:21.465 226298 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1770027126.4637406, 42dc4712-7770-4ecd-abba-8c8e970f8e46 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb  2 05:12:21 np0005604791 nova_compute[226294]: 2026-02-02 10:12:21.465 226298 INFO nova.compute.manager [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] VM Stopped (Lifecycle Event)
Feb  2 05:12:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:21.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:21 np0005604791 nova_compute[226294]: 2026-02-02 10:12:21.498 226298 DEBUG nova.compute.manager [None req-8bcc4e35-da37-46aa-9f06-793d17be0da9 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb  2 05:12:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:21.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:21 np0005604791 nova_compute[226294]: 2026-02-02 10:12:21.711 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:23.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:12:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:23.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:12:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:25.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:25.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:26 np0005604791 nova_compute[226294]: 2026-02-02 10:12:26.449 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:26 np0005604791 nova_compute[226294]: 2026-02-02 10:12:26.713 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:27.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:27.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:29.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:29.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:31 np0005604791 nova_compute[226294]: 2026-02-02 10:12:31.452 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:31.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:31.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:31 np0005604791 nova_compute[226294]: 2026-02-02 10:12:31.715 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:33.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:33.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:35.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:35.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:36 np0005604791 nova_compute[226294]: 2026-02-02 10:12:36.454 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:36 np0005604791 nova_compute[226294]: 2026-02-02 10:12:36.717 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.886176) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027156886237, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1403, "num_deletes": 501, "total_data_size": 2530714, "memory_usage": 2581072, "flush_reason": "Manual Compaction"}
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027156900373, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1165947, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29852, "largest_seqno": 31250, "table_properties": {"data_size": 1161140, "index_size": 1755, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15357, "raw_average_key_size": 19, "raw_value_size": 1148938, "raw_average_value_size": 1467, "num_data_blocks": 77, "num_entries": 783, "num_filter_entries": 783, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027068, "oldest_key_time": 1770027068, "file_creation_time": 1770027156, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 14229 microseconds, and 2708 cpu microseconds.
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.900415) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1165947 bytes OK
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.900432) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.904572) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.904592) EVENT_LOG_v1 {"time_micros": 1770027156904587, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.904609) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 2523116, prev total WAL file size 2523116, number of live WAL files 2.
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.905374) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1138KB)], [57(15MB)]
Feb  2 05:12:36 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027156905450, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 17278607, "oldest_snapshot_seqno": -1}
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5786 keys, 11645443 bytes, temperature: kUnknown
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027157034421, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 11645443, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11609426, "index_size": 20419, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14533, "raw_key_size": 149661, "raw_average_key_size": 25, "raw_value_size": 11507706, "raw_average_value_size": 1988, "num_data_blocks": 818, "num_entries": 5786, "num_filter_entries": 5786, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027156, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.034622) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 11645443 bytes
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.049783) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 133.9 rd, 90.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 15.4 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(24.8) write-amplify(10.0) OK, records in: 6772, records dropped: 986 output_compression: NoCompression
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.049824) EVENT_LOG_v1 {"time_micros": 1770027157049809, "job": 34, "event": "compaction_finished", "compaction_time_micros": 129021, "compaction_time_cpu_micros": 31575, "output_level": 6, "num_output_files": 1, "total_output_size": 11645443, "num_input_records": 6772, "num_output_records": 5786, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027157050136, "job": 34, "event": "table_file_deletion", "file_number": 59}
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027157051958, "job": 34, "event": "table_file_deletion", "file_number": 57}
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.905215) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.052033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.052040) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.052043) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.052045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.052048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:12:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:37 np0005604791 podman[236376]: 2026-02-02 10:12:37.423191036 +0000 UTC m=+0.094178802 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb  2 05:12:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:37.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:37.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:39.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:39.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:41 np0005604791 nova_compute[226294]: 2026-02-02 10:12:41.498 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:41.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:41.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:41 np0005604791 nova_compute[226294]: 2026-02-02 10:12:41.719 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:43.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:43.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:44.911 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:12:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:44.911 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:12:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:12:44.911 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:12:45 np0005604791 podman[236432]: 2026-02-02 10:12:45.363987662 +0000 UTC m=+0.043691142 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb  2 05:12:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:45.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:45.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:46 np0005604791 nova_compute[226294]: 2026-02-02 10:12:46.501 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:46 np0005604791 nova_compute[226294]: 2026-02-02 10:12:46.721 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:47.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:47.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:49.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:49.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb  2 05:12:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:12:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:12:51 np0005604791 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:51.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:12:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:51.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:12:51 np0005604791 nova_compute[226294]: 2026-02-02 10:12:51.563 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:51 np0005604791 nova_compute[226294]: 2026-02-02 10:12:51.723 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:52 np0005604791 ovn_controller[133666]: 2026-02-02T10:12:52Z|00068|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Feb  2 05:12:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb  2 05:12:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:53.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:53 np0005604791 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:53.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:55.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:12:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:12:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:55.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:12:56 np0005604791 nova_compute[226294]: 2026-02-02 10:12:56.587 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:56 np0005604791 nova_compute[226294]: 2026-02-02 10:12:56.725 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:12:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:12:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:12:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:57.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:12:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb  2 05:12:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:12:57 np0005604791 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:57.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:12:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:12:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:12:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb  2 05:12:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:59.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:12:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:12:59 np0005604791 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:59.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:01.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:01.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:01 np0005604791 nova_compute[226294]: 2026-02-02 10:13:01.589 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:13:01 np0005604791 nova_compute[226294]: 2026-02-02 10:13:01.726 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:13:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:03.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:03.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:05.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:05.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:05 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:13:05.808 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb  2 05:13:05 np0005604791 nova_compute[226294]: 2026-02-02 10:13:05.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:13:05 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:13:05.809 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb  2 05:13:06 np0005604791 nova_compute[226294]: 2026-02-02 10:13:06.592 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:13:06 np0005604791 nova_compute[226294]: 2026-02-02 10:13:06.728 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:13:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:07.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:13:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:07.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:13:07 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:13:07 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:13:08 np0005604791 podman[236664]: 2026-02-02 10:13:08.443686691 +0000 UTC m=+0.117851341 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb  2 05:13:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:13:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:09.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:13:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:09.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:09 np0005604791 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb  2 05:13:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:11.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:11.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:11 np0005604791 nova_compute[226294]: 2026-02-02 10:13:11.595 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:13:11 np0005604791 nova_compute[226294]: 2026-02-02 10:13:11.729 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:13:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:13:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:13.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:13:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:13.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:14 np0005604791 nova_compute[226294]: 2026-02-02 10:13:14.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:13:14 np0005604791 nova_compute[226294]: 2026-02-02 10:13:14.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb  2 05:13:14 np0005604791 nova_compute[226294]: 2026-02-02 10:13:14.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb  2 05:13:14 np0005604791 nova_compute[226294]: 2026-02-02 10:13:14.669 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb  2 05:13:14 np0005604791 nova_compute[226294]: 2026-02-02 10:13:14.669 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:13:14 np0005604791 nova_compute[226294]: 2026-02-02 10:13:14.670 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:13:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:15.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:15.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:15 np0005604791 nova_compute[226294]: 2026-02-02 10:13:15.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:13:15 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:13:15.811 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb  2 05:13:16 np0005604791 podman[236695]: 2026-02-02 10:13:16.363446976 +0000 UTC m=+0.039765167 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:13:16 np0005604791 nova_compute[226294]: 2026-02-02 10:13:16.596 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:13:16 np0005604791 nova_compute[226294]: 2026-02-02 10:13:16.643 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:13:16 np0005604791 nova_compute[226294]: 2026-02-02 10:13:16.647 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:13:16 np0005604791 nova_compute[226294]: 2026-02-02 10:13:16.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:13:16 np0005604791 nova_compute[226294]: 2026-02-02 10:13:16.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb  2 05:13:16 np0005604791 nova_compute[226294]: 2026-02-02 10:13:16.731 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:13:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:17.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:13:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:17.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:13:17 np0005604791 nova_compute[226294]: 2026-02-02 10:13:17.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:13:17 np0005604791 nova_compute[226294]: 2026-02-02 10:13:17.668 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:13:18 np0005604791 nova_compute[226294]: 2026-02-02 10:13:18.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:13:18 np0005604791 nova_compute[226294]: 2026-02-02 10:13:18.673 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:13:18 np0005604791 nova_compute[226294]: 2026-02-02 10:13:18.674 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:13:18 np0005604791 nova_compute[226294]: 2026-02-02 10:13:18.674 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:13:18 np0005604791 nova_compute[226294]: 2026-02-02 10:13:18.674 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:13:18 np0005604791 nova_compute[226294]: 2026-02-02 10:13:18.675 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:13:19 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:13:19 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2299787801' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:13:19 np0005604791 nova_compute[226294]: 2026-02-02 10:13:19.143 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:13:19 np0005604791 nova_compute[226294]: 2026-02-02 10:13:19.285 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:13:19 np0005604791 nova_compute[226294]: 2026-02-02 10:13:19.286 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4943MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:13:19 np0005604791 nova_compute[226294]: 2026-02-02 10:13:19.286 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:13:19 np0005604791 nova_compute[226294]: 2026-02-02 10:13:19.286 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:13:19 np0005604791 nova_compute[226294]: 2026-02-02 10:13:19.584 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:13:19 np0005604791 nova_compute[226294]: 2026-02-02 10:13:19.584 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:13:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:19.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:19.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:19 np0005604791 nova_compute[226294]: 2026-02-02 10:13:19.651 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:13:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:13:20 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2021419824' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:13:20 np0005604791 nova_compute[226294]: 2026-02-02 10:13:20.093 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:13:20 np0005604791 nova_compute[226294]: 2026-02-02 10:13:20.100 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:13:20 np0005604791 nova_compute[226294]: 2026-02-02 10:13:20.116 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:13:20 np0005604791 nova_compute[226294]: 2026-02-02 10:13:20.119 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:13:20 np0005604791 nova_compute[226294]: 2026-02-02 10:13:20.119 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:13:20 np0005604791 nova_compute[226294]: 2026-02-02 10:13:20.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:13:20 np0005604791 nova_compute[226294]: 2026-02-02 10:13:20.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:13:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:21.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:21.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:21 np0005604791 nova_compute[226294]: 2026-02-02 10:13:21.599 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:21 np0005604791 nova_compute[226294]: 2026-02-02 10:13:21.733 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:23.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:23.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:13:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:25.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:13:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:25.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:26 np0005604791 nova_compute[226294]: 2026-02-02 10:13:26.638 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:26 np0005604791 nova_compute[226294]: 2026-02-02 10:13:26.735 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:27.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:27.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:28 np0005604791 nova_compute[226294]: 2026-02-02 10:13:28.661 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:13:28 np0005604791 nova_compute[226294]: 2026-02-02 10:13:28.662 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Feb  2 05:13:28 np0005604791 nova_compute[226294]: 2026-02-02 10:13:28.685 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Feb  2 05:13:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:29.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:29.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:31.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:31.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:31 np0005604791 nova_compute[226294]: 2026-02-02 10:13:31.640 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:31 np0005604791 nova_compute[226294]: 2026-02-02 10:13:31.737 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:13:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:33.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:13:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:33.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:13:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:35.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:13:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb  2 05:13:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:35 np0005604791 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:35.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:36 np0005604791 nova_compute[226294]: 2026-02-02 10:13:36.669 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:36 np0005604791 nova_compute[226294]: 2026-02-02 10:13:36.738 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb  2 05:13:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:37 np0005604791 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:37.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:37.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:39 np0005604791 podman[236797]: 2026-02-02 10:13:39.384803414 +0000 UTC m=+0.064765761 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller)
Feb  2 05:13:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb  2 05:13:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:39 np0005604791 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:39.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:39.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb  2 05:13:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:41 np0005604791 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:41.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:41.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:41 np0005604791 nova_compute[226294]: 2026-02-02 10:13:41.672 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:41 np0005604791 nova_compute[226294]: 2026-02-02 10:13:41.740 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb  2 05:13:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:43.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:13:43 np0005604791 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:43.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:13:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:13:44.912 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:13:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:13:44.912 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:13:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:13:44.912 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:13:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb  2 05:13:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:45.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:45 np0005604791 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:45.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:46 np0005604791 nova_compute[226294]: 2026-02-02 10:13:46.674 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:46 np0005604791 nova_compute[226294]: 2026-02-02 10:13:46.742 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:47 np0005604791 podman[236854]: 2026-02-02 10:13:47.384042342 +0000 UTC m=+0.066447466 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, managed_by=edpm_ansible)
Feb  2 05:13:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:13:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:47.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:13:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:47.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:13:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:49.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:13:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:13:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:49.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:13:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:51.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:51.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:51 np0005604791 nova_compute[226294]: 2026-02-02 10:13:51.676 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:51 np0005604791 nova_compute[226294]: 2026-02-02 10:13:51.744 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:53.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:53.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:55.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:55.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:56 np0005604791 nova_compute[226294]: 2026-02-02 10:13:56.710 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:56 np0005604791 nova_compute[226294]: 2026-02-02 10:13:56.746 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:13:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:13:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:57.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:13:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:57.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:13:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:13:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:59.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:13:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:13:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:13:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:59.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:14:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:01.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:01.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:01 np0005604791 nova_compute[226294]: 2026-02-02 10:14:01.716 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:14:01 np0005604791 nova_compute[226294]: 2026-02-02 10:14:01.747 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:14:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:03.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:03.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:05.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:05.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:06 np0005604791 nova_compute[226294]: 2026-02-02 10:14:06.749 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:14:06 np0005604791 nova_compute[226294]: 2026-02-02 10:14:06.751 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:14:06 np0005604791 nova_compute[226294]: 2026-02-02 10:14:06.751 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:14:06 np0005604791 nova_compute[226294]: 2026-02-02 10:14:06.752 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:14:06 np0005604791 nova_compute[226294]: 2026-02-02 10:14:06.758 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:14:06 np0005604791 nova_compute[226294]: 2026-02-02 10:14:06.759 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:14:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:07 np0005604791 systemd-logind[805]: New session 55 of user zuul.
Feb  2 05:14:07 np0005604791 systemd[1]: Started Session 55 of User zuul.
Feb  2 05:14:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:07.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:07.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:07 np0005604791 podman[237036]: 2026-02-02 10:14:07.744331067 +0000 UTC m=+0.074058757 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Feb  2 05:14:07 np0005604791 podman[237036]: 2026-02-02 10:14:07.840073913 +0000 UTC m=+0.169801583 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True)
Feb  2 05:14:08 np0005604791 podman[237254]: 2026-02-02 10:14:08.966587122 +0000 UTC m=+0.048392973 container exec 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 05:14:09 np0005604791 podman[237254]: 2026-02-02 10:14:09.00251041 +0000 UTC m=+0.084316261 container exec_died 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb  2 05:14:09 np0005604791 podman[237400]: 2026-02-02 10:14:09.381908007 +0000 UTC m=+0.060966328 container exec 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 05:14:09 np0005604791 podman[237400]: 2026-02-02 10:14:09.59147437 +0000 UTC m=+0.270532711 container exec_died 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb  2 05:14:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:09.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:09.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:09 np0005604791 podman[237451]: 2026-02-02 10:14:09.740102377 +0000 UTC m=+0.103036251 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb  2 05:14:09 np0005604791 podman[237533]: 2026-02-02 10:14:09.842212652 +0000 UTC m=+0.061499982 container exec 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, release=1793, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, name=keepalived, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20)
Feb  2 05:14:09 np0005604791 podman[237533]: 2026-02-02 10:14:09.85150335 +0000 UTC m=+0.070790730 container exec_died 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Feb  2 05:14:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:14:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:14:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:14:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:14:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb  2 05:14:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb  2 05:14:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:14:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:14:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:14:10 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:14:11 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Feb  2 05:14:11 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3049693552' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb  2 05:14:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:11.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:11.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:11 np0005604791 nova_compute[226294]: 2026-02-02 10:14:11.758 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:14:11 np0005604791 ceph-mon[80115]: Health check failed: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb  2 05:14:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:13.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:14:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:13.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:14:14 np0005604791 nova_compute[226294]: 2026-02-02 10:14:14.673 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:14:15 np0005604791 nova_compute[226294]: 2026-02-02 10:14:15.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:14:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:15.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:14:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:15.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:15 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:14:15 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.670 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.670 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.760 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.762 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.762 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.762 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.799 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:14:16 np0005604791 nova_compute[226294]: 2026-02-02 10:14:16.801 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:14:17 np0005604791 ovs-vsctl[237895]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb  2 05:14:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:17 np0005604791 nova_compute[226294]: 2026-02-02 10:14:17.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:14:17 np0005604791 nova_compute[226294]: 2026-02-02 10:14:17.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:14:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:17.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:17.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:17 np0005604791 nova_compute[226294]: 2026-02-02 10:14:17.687 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:14:17 np0005604791 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb  2 05:14:17 np0005604791 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb  2 05:14:17 np0005604791 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb  2 05:14:18 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:14:18 np0005604791 podman[238108]: 2026-02-02 10:14:18.380792829 +0000 UTC m=+0.078521576 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent)
Feb  2 05:14:18 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: cache status {prefix=cache status} (starting...)
Feb  2 05:14:18 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:18 np0005604791 lvm[238236]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb  2 05:14:18 np0005604791 lvm[238236]: VG ceph_vg0 finished
Feb  2 05:14:18 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: client ls {prefix=client ls} (starting...)
Feb  2 05:14:18 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: damage ls {prefix=damage ls} (starting...)
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump loads {prefix=dump loads} (starting...)
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:19 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Feb  2 05:14:19 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2812603984' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:19 np0005604791 nova_compute[226294]: 2026-02-02 10:14:19.647 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:14:19 np0005604791 nova_compute[226294]: 2026-02-02 10:14:19.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:14:19 np0005604791 nova_compute[226294]: 2026-02-02 10:14:19.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:14:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:19.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:19.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:19 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb  2 05:14:19 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3436703442' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:19 np0005604791 nova_compute[226294]: 2026-02-02 10:14:19.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:14:19 np0005604791 nova_compute[226294]: 2026-02-02 10:14:19.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:14:19 np0005604791 nova_compute[226294]: 2026-02-02 10:14:19.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:14:19 np0005604791 nova_compute[226294]: 2026-02-02 10:14:19.791 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:14:19 np0005604791 nova_compute[226294]: 2026-02-02 10:14:19.792 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Feb  2 05:14:19 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Feb  2 05:14:20 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1326465420' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Feb  2 05:14:20 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: get subtrees {prefix=get subtrees} (starting...)
Feb  2 05:14:20 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:14:20 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3782917356' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.218 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.342 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.343 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4716MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.344 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.344 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:14:20 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: ops {prefix=ops} (starting...)
Feb  2 05:14:20 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.473 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.474 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.487 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.504 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.504 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb  2 05:14:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Feb  2 05:14:20 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3733392687' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.523 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.549 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Feb  2 05:14:20 np0005604791 nova_compute[226294]: 2026-02-02 10:14:20.578 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:14:20 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: session ls {prefix=session ls} (starting...)
Feb  2 05:14:20 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:14:20 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2046104399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:14:21 np0005604791 nova_compute[226294]: 2026-02-02 10:14:21.015 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:14:21 np0005604791 nova_compute[226294]: 2026-02-02 10:14:21.020 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:14:21 np0005604791 nova_compute[226294]: 2026-02-02 10:14:21.043 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:14:21 np0005604791 nova_compute[226294]: 2026-02-02 10:14:21.044 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:14:21 np0005604791 nova_compute[226294]: 2026-02-02 10:14:21.044 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3260189932' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb  2 05:14:21 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: status {prefix=status} (starting...)
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3985708071' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2896364104' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb  2 05:14:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:21.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:21.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/507030427' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb  2 05:14:21 np0005604791 nova_compute[226294]: 2026-02-02 10:14:21.800 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb  2 05:14:21 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3286499931' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb  2 05:14:22 np0005604791 nova_compute[226294]: 2026-02-02 10:14:22.045 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:14:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb  2 05:14:22 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2273732711' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb  2 05:14:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Feb  2 05:14:22 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3889457626' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Feb  2 05:14:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Feb  2 05:14:22 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4267692337' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Feb  2 05:14:23 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb  2 05:14:23 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/300668717' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb  2 05:14:23 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Feb  2 05:14:23 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3225208465' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Feb  2 05:14:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:14:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:23.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:23.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:23 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb  2 05:14:23 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/734854259' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000176 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Start 0.000229 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.010611 2 0.001010
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: merge_log_dups log.dups.size()=0olog.dups.size()=41
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=41
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001634 2 0.000246
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000021 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 81944576 unmapped: 5128192 heap: 87072768 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 120 handle_osd_map epochs [120,121], i have 120, src has [1,121]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 120 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.976458 2 0.000193
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.989453 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=4 mbc={}] exit Started/Stray 0.993056 6 0.000648
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=120/82 les/c/f=121/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010420 4 0.000754
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=120/82 les/c/f=121/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=120/82 les/c/f=121/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=120/82 les/c/f=121/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 44'299 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.007917 3 0.000146
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 44'299 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 44'299 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000096 1 0.000039
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 44'299 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.045353 1 0.000121
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.034063 1 0.000042
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.087581 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started 1.081170 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Reset
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Reset 0.000104 1 0.000164
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000056 1 0.000059
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: merge_log_dups log.dups.size()=0olog.dups.size()=24
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=24
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002091 3 0.000044
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000027 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82018304 unmapped: 6103040 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 122 handle_osd_map epochs [122,123], i have 122, src has [1,123]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 122 handle_osd_map epochs [122,123], i have 123, src has [1,123]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.391129 2 0.000227
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.393440 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=122/84 les/c/f=123/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.044078 3 0.000225
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=122/84 les/c/f=123/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=122/84 les/c/f=123/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000039 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=122/84 les/c/f=123/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 6094848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 123 heartbeat osd_stat(store_statfs(0x4fcaa5000/0x0/0x4ffc00000, data 0xd2086/0x173000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82051072 unmapped: 6070272 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 123 handle_osd_map epochs [123,124], i have 123, src has [1,124]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.140940666s of 11.563747406s, submitted: 64
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b(unlocked)] enter Initial
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=0 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000118 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=0 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000037 1 0.000070
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000146 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000322 1 0.000323
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000077 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000493 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883946 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82067456 unmapped: 6053888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 124 handle_osd_map epochs [124,125], i have 124, src has [1,125]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 1.112428 2 0.000315
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 1.113085 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 1.113293 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000136 1 0.000191
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000008 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 125 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 6045696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=2 mbc={}] exit Started/Stray 1.121016 5 0.000074
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=2 mbc={}] enter Started/ReplicaActive
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=2 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 44'549 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.013432 4 0.000192
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 44'549 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 44'549 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000093 1 0.000045
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 44'549 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] enter Started/ReplicaActive/RepRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.016881 1 0.000101
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 6029312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 126 heartbeat osd_stat(store_statfs(0x4fcaa1000/0x0/0x4ffc00000, data 0xd6146/0x179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.903217 1 0.000050
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.933791 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started 2.054894 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Reset
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Reset 0.000272 1 0.000335
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Start 0.000011 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000051 1 0.000066
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: merge_log_dups log.dups.size()=0olog.dups.size()=15
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=15
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001882 3 0.000063
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 6012928 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 127 ms_handle_reset con 0x5616e226d800 session 0x5616e275dc20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 127 ms_handle_reset con 0x5616dff2d400 session 0x5616e1ec8780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 127 handle_osd_map epochs [127,128], i have 127, src has [1,128]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.034532 2 0.000088
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.036567 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 128 handle_osd_map epochs [127,128], i have 128, src has [1,128]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=127/68 les/c/f=128/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006295 3 0.000262
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=127/68 les/c/f=128/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=127/68 les/c/f=128/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000025 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=127/68 les/c/f=128/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 128 heartbeat osd_stat(store_statfs(0x4fca96000/0x0/0x4ffc00000, data 0xdc36c/0x183000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 5988352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 900489 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 5980160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 128 handle_osd_map epochs [129,130], i have 128, src has [1,130]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82149376 unmapped: 5971968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82149376 unmapped: 5971968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 130 handle_osd_map epochs [131,132], i have 130, src has [1,132]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82214912 unmapped: 5906432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 132 heartbeat osd_stat(store_statfs(0x4fca8c000/0x0/0x4ffc00000, data 0xe4254/0x18f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e(unlocked)] enter Initial
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=0 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000063 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=0 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000018 1 0.000034
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000121 1 0.000055
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000028 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000169 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 5898240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 916045 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.393966675s of 10.596799850s, submitted: 70
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 133 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 1.021068 2 0.000058
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 1.021285 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 1.021327 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000183 1 0.000267
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000038 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 5898240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f(unlocked)] enter Initial
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=0 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000060 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=0 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000016 1 0.000029
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.001011 1 0.000044
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000032 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.001067 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 1.019780 5 0.000135
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 44'600 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.014964 4 0.000157
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 44'600 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 44'600 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000134 1 0.000096
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 44'600 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 5840896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.047682 1 0.000062
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 135 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 0.357885 2 0.000069
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 0.358991 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 0.359019 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.294867 1 0.000041
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.357817 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started 1.377689 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Reset
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000112 1 0.000172
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000010 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Reset 0.000124 1 0.000218
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Start 0.000026 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000089 1 0.000143
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: merge_log_dups log.dups.size()=0 olog.dups.size()=29
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=29
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002163 3 0.000172
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca84000/0x0/0x4ffc00000, data 0xe8314/0x195000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 5824512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 136 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.969000 2 0.000098
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.971374 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 0.971604 5 0.000515
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/74 les/c/f=137/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.003521 4 0.000519
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/74 les/c/f=137/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/74 les/c/f=137/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000019 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/74 les/c/f=137/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 44'471 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.006987 4 0.000211
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 44'471 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 44'471 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000083 1 0.000067
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 44'471 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.043071 1 0.000050
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 5988352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.978616 1 0.000080
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 1.028919 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started 2.000586 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Reset
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Reset 0.000075 1 0.000117
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Start
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000343 1 0.000060
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: merge_log_dups log.dups.size()=0 olog.dups.size()=31
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=31
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001450 3 0.000062
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000007 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 5931008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.065664 2 0.000105
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.067577 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=138/96 les/c/f=139/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.025509 4 0.000275
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=138/96 les/c/f=139/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=138/96 les/c/f=139/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000066 0 0.000000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=138/96 les/c/f=139/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949870 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 5922816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca75000/0x0/0x4ffc00000, data 0xf239c/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 5922816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca75000/0x0/0x4ffc00000, data 0xf239c/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 6062080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 6062080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 6062080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca75000/0x0/0x4ffc00000, data 0xf239c/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951382 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 6045696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 6045696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 6045696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.920713425s of 12.206723213s, submitted: 63
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 6037504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: mgrc handle_mgr_map Got map version 30
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: mgrc handle_mgr_map Active mgr is now [v2:192.168.122.100:6800/1282799344,v1:192.168.122.100:6801/1282799344]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 5865472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950130 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 5865472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 5865472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 5865472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 5857280 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 5857280 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 5857280 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 5857280 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 5857280 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 5849088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 5849088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 5840896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 5840896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 5832704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 5832704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 5832704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 5824512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 5824512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 5816320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 5816320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 5816320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 5808128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 5808128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 5799936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 5799936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 5799936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 5791744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321d000 session 0x5616dfb63c20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 5791744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82345984 unmapped: 5775360 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82362368 unmapped: 5758976 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 5750784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 5750784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 5750784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 5742592 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 5742592 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82386944 unmapped: 5734400 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 5726208 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82403328 unmapped: 5718016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 39.616950989s of 39.631927490s, submitted: 4
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82403328 unmapped: 5718016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82403328 unmapped: 5718016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 5709824 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949778 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 5709824 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82419712 unmapped: 5701632 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82419712 unmapped: 5701632 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82419712 unmapped: 5701632 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82427904 unmapped: 5693440 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951290 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82427904 unmapped: 5693440 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82436096 unmapped: 5685248 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82444288 unmapped: 5677056 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 5668864 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 5668864 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950699 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 5660672 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 5660672 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 5652480 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 5652480 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.212181091s of 17.223489761s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 5644288 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950567 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 5644288 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 5636096 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 5636096 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 5636096 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 5627904 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950567 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 5627904 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 5627904 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 5619712 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 5619712 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 5611520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950567 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 5611520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2617800 session 0x5616e1eb5c20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3543400 session 0x5616e359af00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 5603328 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 5603328 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 5595136 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 5595136 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950567 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 5595136 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 5586944 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 5586944 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21acc00 session 0x5616e04dc1e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e34d4c00 session 0x5616dfb745a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 5570560 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 5570560 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950567 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 5562368 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82567168 unmapped: 5554176 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.014419556s of 23.017801285s, submitted: 1
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82567168 unmapped: 5554176 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2273400 session 0x5616e34745a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3284800 session 0x5616e1ec9e00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 5545984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 5545984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950699 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 5545984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 5537792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 5537792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 5537792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 5529600 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950831 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 5529600 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 5521408 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 5521408 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 5521408 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.597830772s of 11.617882729s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 5513216 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950963 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 5513216 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82616320 unmapped: 5505024 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82616320 unmapped: 5505024 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82624512 unmapped: 5496832 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82632704 unmapped: 5488640 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950831 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82632704 unmapped: 5488640 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82640896 unmapped: 5480448 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82640896 unmapped: 5480448 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82640896 unmapped: 5480448 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.640151978s of 10.649944305s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82657280 unmapped: 5464064 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951752 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82665472 unmapped: 5455872 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82665472 unmapped: 5455872 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82673664 unmapped: 5447680 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951620 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82690048 unmapped: 5431296 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82690048 unmapped: 5431296 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82690048 unmapped: 5431296 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82698240 unmapped: 5423104 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82698240 unmapped: 5423104 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 5414912 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 5414912 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82714624 unmapped: 5406720 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 5398528 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 5398528 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 5390336 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 5390336 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82739200 unmapped: 5382144 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 5373952 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 5365760 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 5365760 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 5365760 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 5357568 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 5357568 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 5357568 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82771968 unmapped: 5349376 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82771968 unmapped: 5349376 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82780160 unmapped: 5341184 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82780160 unmapped: 5341184 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82780160 unmapped: 5341184 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226f400 session 0x5616e2d70b40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2617800 session 0x5616e0572780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82788352 unmapped: 5332992 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82788352 unmapped: 5332992 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 5316608 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 5316608 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82812928 unmapped: 5308416 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3284000 session 0x5616df7d8780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82665472 unmapped: 5455872 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82673664 unmapped: 5447680 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 41.430633545s of 41.440040588s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82673664 unmapped: 5447680 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953132 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82690048 unmapped: 5431296 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82690048 unmapped: 5431296 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82698240 unmapped: 5423104 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82698240 unmapped: 5423104 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 5414912 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953264 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 5414912 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 5414912 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.190359116s of 11.200112343s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 5398528 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 5398528 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 5390336 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 954185 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 5390336 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82739200 unmapped: 5382144 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82739200 unmapped: 5382144 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82739200 unmapped: 5382144 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 5373952 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 954053 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 5373952 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 5373952 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 5365760 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.691310883s of 11.704643250s, submitted: 4
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82771968 unmapped: 5349376 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82771968 unmapped: 5349376 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82780160 unmapped: 5341184 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82780160 unmapped: 5341184 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82788352 unmapped: 5332992 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82788352 unmapped: 5332992 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 5316608 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 5316608 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82812928 unmapped: 5308416 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82812928 unmapped: 5308416 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82812928 unmapped: 5308416 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 5300224 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 5300224 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82829312 unmapped: 5292032 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82829312 unmapped: 5292032 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82837504 unmapped: 5283840 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82837504 unmapped: 5283840 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82845696 unmapped: 5275648 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82845696 unmapped: 5275648 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82853888 unmapped: 5267456 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82853888 unmapped: 5267456 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82862080 unmapped: 5259264 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82862080 unmapped: 5259264 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82870272 unmapped: 5251072 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82870272 unmapped: 5251072 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82878464 unmapped: 5242880 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82886656 unmapped: 5234688 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82886656 unmapped: 5234688 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82894848 unmapped: 5226496 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82894848 unmapped: 5226496 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82894848 unmapped: 5226496 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82903040 unmapped: 5218304 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 5300224 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82829312 unmapped: 5292032 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82829312 unmapped: 5292032 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82829312 unmapped: 5292032 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82837504 unmapped: 5283840 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82837504 unmapped: 5283840 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82845696 unmapped: 5275648 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82845696 unmapped: 5275648 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82845696 unmapped: 5275648 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82853888 unmapped: 5267456 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82862080 unmapped: 5259264 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82862080 unmapped: 5259264 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82870272 unmapped: 5251072 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82870272 unmapped: 5251072 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82878464 unmapped: 5242880 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82878464 unmapped: 5242880 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82886656 unmapped: 5234688 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82886656 unmapped: 5234688 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82886656 unmapped: 5234688 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82894848 unmapped: 5226496 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82894848 unmapped: 5226496 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82903040 unmapped: 5218304 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82903040 unmapped: 5218304 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82903040 unmapped: 5218304 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82911232 unmapped: 5210112 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82911232 unmapped: 5210112 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82911232 unmapped: 5210112 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82919424 unmapped: 5201920 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82919424 unmapped: 5201920 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 5185536 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 5185536 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 5185536 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82944000 unmapped: 5177344 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82952192 unmapped: 5169152 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82952192 unmapped: 5169152 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82960384 unmapped: 5160960 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82960384 unmapped: 5160960 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82968576 unmapped: 5152768 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82968576 unmapped: 5152768 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82976768 unmapped: 5144576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82976768 unmapped: 5144576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82976768 unmapped: 5144576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82984960 unmapped: 5136384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82984960 unmapped: 5136384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 5128192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 5128192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 5120000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 5120000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 5120000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 5111808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 5111808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 5111808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 5103616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321f800 session 0x5616e2d714a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e11f9000 session 0x5616e1ec90e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 5103616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 5087232 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 5087232 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 5087232 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83042304 unmapped: 5079040 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83042304 unmapped: 5079040 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 5070848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 5070848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 5070848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83058688 unmapped: 5062656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83058688 unmapped: 5062656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 99.223167419s of 99.225830078s, submitted: 1
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83066880 unmapped: 5054464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83066880 unmapped: 5054464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 5046272 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956486 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 3989504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 3989504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2a98800 session 0x5616e2d71a40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e327dc00 session 0x5616e2d710e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 3981312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 3981312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 3981312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956486 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84148224 unmapped: 3973120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84148224 unmapped: 3973120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84148224 unmapped: 3973120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84156416 unmapped: 3964928 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.070820808s of 12.081949234s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83107840 unmapped: 5013504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 955895 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 5005312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 5005312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 4997120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 4997120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 4997120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 955895 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83132416 unmapped: 4988928 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83132416 unmapped: 4988928 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83140608 unmapped: 4980736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83140608 unmapped: 4980736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83140608 unmapped: 4980736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 957407 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 4972544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 4972544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.191962242s of 13.212522507s, submitted: 4
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83156992 unmapped: 4964352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83156992 unmapped: 4964352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83156992 unmapped: 4964352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956816 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 4956160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 4956160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 4956160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 4947968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 4947968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956093 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 4939776 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226b400 session 0x5616e2d70d20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e327c000 session 0x5616e1f7be00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 4939776 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 4939776 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83197952 unmapped: 4923392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83197952 unmapped: 4923392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956093 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83206144 unmapped: 4915200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83206144 unmapped: 4915200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83214336 unmapped: 4907008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83214336 unmapped: 4907008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83214336 unmapped: 4907008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956093 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 4898816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 4898816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.953041077s of 19.965154648s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83230720 unmapped: 4890624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83230720 unmapped: 4890624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83238912 unmapped: 4882432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956225 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83247104 unmapped: 4874240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83247104 unmapped: 4874240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83247104 unmapped: 4874240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83255296 unmapped: 4866048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83255296 unmapped: 4866048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 959249 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83263488 unmapped: 4857856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83263488 unmapped: 4857856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83271680 unmapped: 4849664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 8069 writes, 33K keys, 8069 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s
Cumulative WAL: 8069 writes, 1528 syncs, 5.28 writes per sync, written: 0.02 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 8069 writes, 33K keys, 8069 commit groups, 1.0 writes per commit group, ingest: 21.03 MB, 0.04 MB/s
Interval WAL: 8069 writes, 1528 syncs, 5.28 writes per sync, written: 0.02 GB, 0.04 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83337216 unmapped: 4784128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83337216 unmapped: 4784128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958658 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 4775936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 4775936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83353600 unmapped: 4767744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.006532669s of 16.020818710s, submitted: 4
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83353600 unmapped: 4767744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83353600 unmapped: 4767744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2254800 session 0x5616e05723c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2cad400 session 0x5616e2d71c20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958526 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 4759552 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 4759552 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83369984 unmapped: 4751360 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83378176 unmapped: 4743168 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83378176 unmapped: 4743168 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958526 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 4734976 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 4734976 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83394560 unmapped: 4726784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83394560 unmapped: 4726784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83394560 unmapped: 4726784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958526 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 4718592 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.845773697s of 12.849323273s, submitted: 1
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 4718592 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 4718592 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 4710400 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 4710400 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 960170 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 4702208 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 4702208 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83427328 unmapped: 4694016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83427328 unmapped: 4694016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83427328 unmapped: 4694016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 959579 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83435520 unmapped: 4685824 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83435520 unmapped: 4685824 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 4677632 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.056375504s of 12.069332123s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 4677632 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 4669440 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958988 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 4669440 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 4669440 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 4661248 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 4661248 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 4653056 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 4653056 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 4653056 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 4644864 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 4644864 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 4636672 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 4636672 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 4636672 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 4628480 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 4628480 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83501056 unmapped: 4620288 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83501056 unmapped: 4620288 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83509248 unmapped: 4612096 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83517440 unmapped: 4603904 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83517440 unmapped: 4603904 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83525632 unmapped: 4595712 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83525632 unmapped: 4595712 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83533824 unmapped: 4587520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83533824 unmapped: 4587520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2619c00 session 0x5616e26210e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e108a400 session 0x5616dfb752c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83533824 unmapped: 4587520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83533824 unmapped: 4587520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83542016 unmapped: 4579328 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83542016 unmapped: 4579328 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83566592 unmapped: 4554752 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83566592 unmapped: 4554752 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83574784 unmapped: 4546560 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83574784 unmapped: 4546560 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83582976 unmapped: 4538368 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e10eec00 session 0x5616e359a3c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3542800 session 0x5616e1eb41e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83582976 unmapped: 4538368 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 35.694198608s of 35.783321381s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83591168 unmapped: 4530176 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83599360 unmapped: 4521984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958988 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83599360 unmapped: 4521984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83599360 unmapped: 4521984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83607552 unmapped: 4513792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83607552 unmapped: 4513792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83607552 unmapped: 4513792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958988 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83615744 unmapped: 4505600 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83615744 unmapped: 4505600 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83615744 unmapped: 4505600 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.844978333s of 10.003371239s, submitted: 44
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83632128 unmapped: 4489216 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 4243456 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 959120 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 959120 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958988 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.813514709s of 13.270147324s, submitted: 293
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e327e400 session 0x5616e33d2b40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83968000 unmapped: 4153344 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83968000 unmapped: 4153344 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83968000 unmapped: 4153344 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 4145152 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 4145152 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 4145152 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3542400 session 0x5616e34ea5a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83984384 unmapped: 4136960 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.845741272s of 28.848985672s, submitted: 1
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83984384 unmapped: 4136960 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 4128768 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 4128768 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 960500 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 960500 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.034110069s of 10.046130180s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 960500 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962012 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.128645897s of 12.140229225s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961421 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961289 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 4096000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 4096000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 4096000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961289 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e0540c00 session 0x5616e1f7af00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961289 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3543000 session 0x5616e1f7ba40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2270c00 session 0x5616e337fe00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961289 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 25.132347107s of 25.178052902s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961421 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961553 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84049920 unmapped: 4071424 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84049920 unmapped: 4071424 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84049920 unmapped: 4071424 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84049920 unmapped: 4071424 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.086726189s of 12.105925560s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962474 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962474 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962342 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.559317589s of 10.567891121s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321e000 session 0x5616e34dbc20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321f800 session 0x5616e0d17860
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.825603485s of 16.828807831s, submitted: 1
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962342 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963854 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.089159966s of 12.096708298s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963263 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3283c00 session 0x5616e3750f00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2cafc00 session 0x5616e3750960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3542000 session 0x5616e311a000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226ac00 session 0x5616e2797860
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 57.016975403s of 57.231136322s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85188608 unmapped: 2932736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85188608 unmapped: 2932736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964907 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966419 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.142086029s of 13.158586502s, submitted: 4
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e11f9000 session 0x5616e278a000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3281c00 session 0x5616e312c000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226c400 session 0x5616e311fe00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e11f9800 session 0x5616e311a3c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.639591217s of 18.651557922s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.387768745s of 10.394624710s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964646 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964514 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21e4000 session 0x5616e311af00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3281000 session 0x5616e312ef00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.559337616s of 33.630935669s, submitted: 4
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.959052086s of 15.965865135s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e0541c00 session 0x5616e0d6e1e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3222000 session 0x5616e311b0e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e1193000 session 0x5616e312c960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 53.633445740s of 53.641864777s, submitted: 1
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967538 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967538 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.074842453s of 12.082664490s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966947 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321a800 session 0x5616e311a780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3220400 session 0x5616e34ea5a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 2834432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 2834432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.421203613s of 22.434373856s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968459 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967868 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967868 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.030465126s of 15.042689323s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e327a000 session 0x5616e3750960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967736 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967736 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.958620071s of 10.963719368s, submitted: 1
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970892 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970892 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970301 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.377700806s of 13.986701965s, submitted: 4
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3282400 session 0x5616e311a3c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 48.270744324s of 48.274307251s, submitted: 1
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970301 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971813 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.719709396s of 17.732963562s, submitted: 4
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21adc00 session 0x5616e311b860
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2caf800 session 0x5616e337f2c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.156009674s of 34.160236359s, submitted: 1
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 8815 writes, 34K keys, 8815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 8815 writes, 1876 syncs, 4.70 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 746 writes, 1209 keys, 746 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
Interval WAL: 746 writes, 348 syncs, 2.14 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e1193c00 session 0x5616e311bc20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3285400 session 0x5616e311a780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb  2 05:14:24 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/933138151' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.711874962s of 15.720390320s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.441757202s of 16.448879242s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971552 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e34d4800 session 0x5616e3688780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e10efc00 session 0x5616e312ef00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 49.622116089s of 49.629035950s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 3768320 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 3612672 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971552 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973064 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread fragmentation_score=0.000029 took=0.000037s
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973064 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.980213165s of 18.280017853s, submitted: 343
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21ac400 session 0x5616e3688000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 41.939346313s of 41.943122864s, submitted: 1
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 974576 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 974576 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973985 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.912214279s of 17.978521347s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 977619 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 94093312 unmapped: 3465216 heap: 97558528 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 141 ms_handle_reset con 0x5616e327c800 session 0x5616e1f7b0e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85827584 unmapped: 20127744 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb65f000/0x0/0x4ffc00000, data 0x10f66e2/0x11ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 141 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85868544 unmapped: 20086784 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 142 ms_handle_reset con 0x5616e21e4c00 session 0x5616e37ce780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1141871 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fae59000/0x0/0x4ffc00000, data 0x18fa8f2/0x19b1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e108a800 session 0x5616e3688d20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e3280c00 session 0x5616e36965a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 39.927474976s of 40.098861694s, submitted: 30
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147855 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae58000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1148527 data_alloc: 218103808 data_used: 143360
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e226a400 session 0x5616e1eb4780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 145 heartbeat osd_stat(store_statfs(0x4fae58000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 146 ms_handle_reset con 0x5616e2cac000 session 0x5616e34770e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102490112 unmapped: 3465216 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 3440640 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212945 data_alloc: 234881024 data_used: 13774848
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.954066277s of 12.084420204s, submitted: 24
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fad62000/0x0/0x4ffc00000, data 0x19eeaf0/0x1aa8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102416384 unmapped: 3538944 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102416384 unmapped: 3538944 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213508 data_alloc: 234881024 data_used: 13774848
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 2940928 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218848 data_alloc: 234881024 data_used: 14458880
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327c400 session 0x5616e34eb2c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3222400 session 0x5616e3378780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218848 data_alloc: 234881024 data_used: 14458880
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.370294571s of 15.401467323s, submitted: 11
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 2965504 heap: 109101056 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [0,0,1])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106307584 unmapped: 2793472 heap: 109101056 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1281178 data_alloc: 234881024 data_used: 14733312
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f944e000/0x0/0x4ffc00000, data 0x215bac2/0x2216000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1276638 data_alloc: 234881024 data_used: 14733312
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.930480957s of 10.239095688s, submitted: 95
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9435000/0x0/0x4ffc00000, data 0x217cac2/0x2237000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1277223 data_alloc: 234881024 data_used: 14733312
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9435000/0x0/0x4ffc00000, data 0x217cac2/0x2237000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1277072 data_alloc: 234881024 data_used: 14733312
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f942c000/0x0/0x4ffc00000, data 0x2185ac2/0x2240000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e000 session 0x5616e1f04960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d800 session 0x5616e1f054a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0ff1c00 session 0x5616e1f05e00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106577920 unmapped: 3571712 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106577920 unmapped: 3571712 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283800 session 0x5616e1f04780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 1802240 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.226655960s of 12.266713142s, submitted: 7
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321f000 session 0x5616e337e000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108953600 unmapped: 3293184 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0ff1c00 session 0x5616e337f860
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293665 data_alloc: 234881024 data_used: 15781888
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e000 session 0x5616e0d161e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293681 data_alloc: 234881024 data_used: 15781888
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 3219456 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 3219456 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99000 session 0x5616e1f7b2c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109043712 unmapped: 3203072 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.034008980s of 10.189837456s, submitted: 37
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109182976 unmapped: 3063808 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1294918 data_alloc: 234881024 data_used: 15740928
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1295070 data_alloc: 234881024 data_used: 15749120
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321c800 session 0x5616e3750d20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2273400 session 0x5616e36892c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108748800 unmapped: 3497984 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92c9000/0x0/0x4ffc00000, data 0x22e7b24/0x23a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.504937172s of 10.520147324s, submitted: 3
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108732416 unmapped: 5611520 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1341628 data_alloc: 234881024 data_used: 15806464
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109731840 unmapped: 4612096 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29abb24/0x2a67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99800 session 0x5616e27985a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542c00 session 0x5616e2797860
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352434 data_alloc: 234881024 data_used: 15826944
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29aeb24/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29aeb24/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e10eec00 session 0x5616e311e1e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542000 session 0x5616e2620000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.650221825s of 10.002939224s, submitted: 115
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617400 session 0x5616e337e3c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286162 data_alloc: 234881024 data_used: 15585280
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9422000/0x0/0x4ffc00000, data 0x218eac2/0x2249000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f941f000/0x0/0x4ffc00000, data 0x2191ac2/0x224c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f941f000/0x0/0x4ffc00000, data 0x2191ac2/0x224c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286522 data_alloc: 234881024 data_used: 15585280
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 4669440 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 4669440 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321dc00 session 0x5616e3475a40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e3751860
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212351 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.967247963s of 12.068682671s, submitted: 26
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215375 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 5431296 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 5431296 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215111 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215111 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.472117424s of 15.489373207s, submitted: 5
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327e000 session 0x5616e2797c20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f922c000/0x0/0x4ffc00000, data 0x2385ac2/0x2440000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290065 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f922c000/0x0/0x4ffc00000, data 0x2385ac2/0x2440000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290065 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e2d70d20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220800 session 0x5616e312f4a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e337ef00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109068288 unmapped: 22077440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [1])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114139136 unmapped: 17006592 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118480896 unmapped: 12664832 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366317 data_alloc: 234881024 data_used: 25640960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366317 data_alloc: 234881024 data_used: 25640960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.837284088s of 18.897920609s, submitted: 9
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 12107776 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119062528 unmapped: 12083200 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f901d000/0x0/0x4ffc00000, data 0x2594ac2/0x264f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1397161 data_alloc: 234881024 data_used: 26198016
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1396570 data_alloc: 234881024 data_used: 26198016
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.991930008s of 12.083539009s, submitted: 22
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394635 data_alloc: 234881024 data_used: 26198016
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321dc00 session 0x5616e3475e00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e0ffeb40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.675941467s of 17.711282730s, submitted: 2
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3222c00 session 0x5616e3689a40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2273400 session 0x5616e34eaf00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e34ead20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e278ad20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114892800 unmapped: 16252928 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e3408000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284800 session 0x5616e36914a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.119201660s of 29.155227661s, submitted: 11
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99c00 session 0x5616e311b0e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 20881408 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e10e7c20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e10e63c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e0572780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284800 session 0x5616e1f7a1e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273675 data_alloc: 234881024 data_used: 14823424
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ad2/0x19be000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113844224 unmapped: 20979712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113844224 unmapped: 20979712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4000 session 0x5616e0d17860
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226b000 session 0x5616e37cef00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273675 data_alloc: 234881024 data_used: 14823424
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e34ea780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321277 data_alloc: 234881024 data_used: 16965632
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321277 data_alloc: 234881024 data_used: 16965632
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.900746346s of 15.997513771s, submitted: 18
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90fc000/0x0/0x4ffc00000, data 0x24b3af5/0x2570000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113655808 unmapped: 21168128 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90fc000/0x0/0x4ffc00000, data 0x24b3af5/0x2570000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1367081 data_alloc: 234881024 data_used: 17113088
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366154 data_alloc: 234881024 data_used: 17113088
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366154 data_alloc: 234881024 data_used: 17113088
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.825149536s of 17.985601425s, submitted: 37
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366022 data_alloc: 234881024 data_used: 17113088
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321bc00 session 0x5616e3696d20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f800 session 0x5616df7d8780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e311f860
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616dfb71c20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e0d161e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226b000 session 0x5616e1f054a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321bc00 session 0x5616e311a960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f800 session 0x5616e37ced20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e0d6ed20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114393088 unmapped: 20430848 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c38000/0x0/0x4ffc00000, data 0x2976b57/0x2a34000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413608 data_alloc: 234881024 data_used: 17113088
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114434048 unmapped: 20389888 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114434048 unmapped: 20389888 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.185551643s of 10.350721359s, submitted: 46
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e34db4a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 20652032 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 20652032 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417109 data_alloc: 234881024 data_used: 17113088
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114868224 unmapped: 19955712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449333 data_alloc: 234881024 data_used: 20922368
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449333 data_alloc: 234881024 data_used: 20922368
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 18014208 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.333662033s of 12.362901688s, submitted: 6
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 13238272 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 12673024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123871232 unmapped: 10952704 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f833b000/0x0/0x4ffc00000, data 0x3272b7a/0x3331000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123904000 unmapped: 10919936 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1524459 data_alloc: 234881024 data_used: 21643264
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123904000 unmapped: 10919936 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123912192 unmapped: 10911744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3280000 session 0x5616e2cd8b40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99000 session 0x5616e311e960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123445248 unmapped: 11378688 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2caf800 session 0x5616e2cd8780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370129 data_alloc: 234881024 data_used: 15147008
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370129 data_alloc: 234881024 data_used: 15147008
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119586816 unmapped: 15237120 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e34061e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.954436302s of 15.854330063s, submitted: 169
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e311a5a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226d000 session 0x5616e1f052c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.149909973s of 20.260728836s, submitted: 33
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321c400 session 0x5616e04dd680
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2caec00 session 0x5616e2797860
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e263c960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226d000 session 0x5616e359af00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e04dc1e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250106 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250238 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250238 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.556289673s of 17.624111176s, submitted: 16
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255618 data_alloc: 234881024 data_used: 10326016
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 17833984 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b34000/0x0/0x4ffc00000, data 0x1a7cac2/0x1b37000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d6000 session 0x5616e0f6e000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321a000 session 0x5616e337e5a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283400 session 0x5616e2d70000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282800 session 0x5616e34741e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.186046600s of 29.217700958s, submitted: 7
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258580 data_alloc: 234881024 data_used: 10338304
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283000 session 0x5616e313e780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f5000/0x0/0x4ffc00000, data 0x20bcac2/0x2177000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310128 data_alloc: 234881024 data_used: 10338304
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 23166976 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321cc00 session 0x5616e3476780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f5000/0x0/0x4ffc00000, data 0x20bcac2/0x2177000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3543400 session 0x5616e33d3e00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 23166976 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321cc00 session 0x5616e33d3a40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.488653183s of 10.665133476s, submitted: 14
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282800 session 0x5616e33d23c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 23158784 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1315916 data_alloc: 234881024 data_used: 10338304
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 23158784 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 23068672 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357250 data_alloc: 234881024 data_used: 16588800
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357250 data_alloc: 234881024 data_used: 16588800
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.383611679s of 12.435736656s, submitted: 16
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 16957440 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122388480 unmapped: 16113664 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1437520 data_alloc: 234881024 data_used: 17522688
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8bae000/0x0/0x4ffc00000, data 0x2a00af5/0x2abd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8bae000/0x0/0x4ffc00000, data 0x2a00af5/0x2abd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122970112 unmapped: 15532032 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435392 data_alloc: 234881024 data_used: 17522688
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b90000/0x0/0x4ffc00000, data 0x2a1faf5/0x2adc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.516505241s of 13.009800911s, submitted: 89
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 16072704 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435504 data_alloc: 234881024 data_used: 17522688
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122437632 unmapped: 16064512 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122437632 unmapped: 16064512 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435504 data_alloc: 234881024 data_used: 17522688
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.440578461s of 19.455564499s, submitted: 4
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283400 session 0x5616e312e960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283000 session 0x5616e37ceb40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e34065a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268225 data_alloc: 234881024 data_used: 10338304
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268225 data_alloc: 234881024 data_used: 10338304
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.921215057s of 17.030107498s, submitted: 36
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e800 session 0x5616e3406d20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542400 session 0x5616e34772c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
** DB Stats **
Uptime(secs): 1800.1 total, 600.0 interval
Cumulative writes: 11K writes, 41K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
Cumulative WAL: 11K writes, 2812 syncs, 3.92 writes per sync, written: 0.03 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2196 writes, 6935 keys, 2196 commit groups, 1.0 writes per commit group, ingest: 6.82 MB, 0.01 MB/s
Interval WAL: 2196 writes, 936 syncs, 2.35 writes per sync, written: 0.01 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.137660980s of 19.212322235s, submitted: 21
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617c00 session 0x5616e312fe00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9516000/0x0/0x4ffc00000, data 0x1c8bac2/0x1d46000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282483 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e34734a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e10efc00 session 0x5616e34ea3c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9516000/0x0/0x4ffc00000, data 0x1c8bac2/0x1d46000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617c00 session 0x5616e2d70780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e800 session 0x5616e1f05c20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116129792 unmapped: 22372352 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116170752 unmapped: 22331392 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 21725184 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2255c00 session 0x5616e2d71680
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e3476f00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 21725184 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312446 data_alloc: 234881024 data_used: 14000128
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.036809921s of 10.113059044s, submitted: 15
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3285000 session 0x5616e36881e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f1000/0x0/0x4ffc00000, data 0x1cafad2/0x1d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258361 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2619000 session 0x5616e33d2780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e3494000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283c00 session 0x5616e3691c20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e34ea1e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99800 session 0x5616e337f0e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e0fffc20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e05730e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2619000 session 0x5616e2cd83c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283c00 session 0x5616e05a2780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5400 session 0x5616dfb74b40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e311b680
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1346877 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8d84000/0x0/0x4ffc00000, data 0x241cad2/0x24d8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e3476000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282c00 session 0x5616e34761e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321ec00 session 0x5616e359ab40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.106111526s of 13.270271301s, submitted: 38
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5800 session 0x5616e33d2f00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 29736960 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348691 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8d83000/0x0/0x4ffc00000, data 0x241cae2/0x24d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e3404780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e04dc960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2cae400 session 0x5616e311a000
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.711370468s of 20.742525101s, submitted: 12
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265990 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265990 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327fc00 session 0x5616e312d680
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298342 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0541400 session 0x5616dfb705a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0541000 session 0x5616dfb703c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f961c000/0x0/0x4ffc00000, data 0x1b85ac2/0x1c40000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e11f9800 session 0x5616e311b4a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116719616 unmapped: 29130752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116719616 unmapped: 29130752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.859712601s of 14.898717880s, submitted: 17
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4800 session 0x5616e04dde00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301202 data_alloc: 234881024 data_used: 10452992
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116490240 unmapped: 29360128 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308194 data_alloc: 234881024 data_used: 11509760
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,1])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116695040 unmapped: 29155328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116858880 unmapped: 28991488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116858880 unmapped: 28991488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116867072 unmapped: 28983296 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116817920 unmapped: 29032448 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.749152184s of 11.151672363s, submitted: 349
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311098 data_alloc: 234881024 data_used: 11579392
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119087104 unmapped: 26763264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8920000/0x0/0x4ffc00000, data 0x286aac2/0x2925000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422902 data_alloc: 234881024 data_used: 11780096
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8920000/0x0/0x4ffc00000, data 0x286aac2/0x2925000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1414566 data_alloc: 234881024 data_used: 11788288
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.005815506s of 12.341868401s, submitted: 120
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8903000/0x0/0x4ffc00000, data 0x289eac2/0x2959000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 26370048 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220400 session 0x5616e04dd680
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284c00 session 0x5616e3477680
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 26370048 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108b400 session 0x5616dfb634a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 26451968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 26451968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618800 session 0x5616e312d0e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108b400 session 0x5616e312fc20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220400 session 0x5616e34725a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284c00 session 0x5616e312c780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 26427392 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.763683319s of 20.829357147s, submitted: 19
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4800 session 0x5616e3477c20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e11f9c00 session 0x5616e05732c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119709696 unmapped: 26140672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1324070 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d7000 session 0x5616e04dc960
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120020992 unmapped: 25829376 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119808000 unmapped: 26042368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1330608 data_alloc: 234881024 data_used: 11010048
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356296 data_alloc: 234881024 data_used: 14831616
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 24354816 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.424659729s of 16.518987656s, submitted: 14
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3281400 session 0x5616e0d6ed20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123994112 unmapped: 21856256 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453094 data_alloc: 234881024 data_used: 15106048
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124026880 unmapped: 21823488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87f4000/0x0/0x4ffc00000, data 0x29abac2/0x2a66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1459086 data_alloc: 234881024 data_used: 15007744
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x29b3ac2/0x2a6e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0540c00 session 0x5616e0cdb0e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327cc00 session 0x5616e278a780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 22183936 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2616400 session 0x5616e36910e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123813888 unmapped: 22036480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1466042 data_alloc: 234881024 data_used: 16191488
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87c7000/0x0/0x4ffc00000, data 0x29daac2/0x2a95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1466042 data_alloc: 234881024 data_used: 16191488
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87c7000/0x0/0x4ffc00000, data 0x29daac2/0x2a95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.643545151s of 19.996114731s, submitted: 90
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124805120 unmapped: 21045248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1548442 data_alloc: 234881024 data_used: 16363520
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124837888 unmapped: 21012480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7d82000/0x0/0x4ffc00000, data 0x3417ac2/0x34d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,1])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 21381120 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2cac000 session 0x5616e1f04b40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542800 session 0x5616e1ec81e0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555186 data_alloc: 234881024 data_used: 16371712
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123117568 unmapped: 22732800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7d01000/0x0/0x4ffc00000, data 0x34a0ac2/0x355b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555554 data_alloc: 234881024 data_used: 16371712
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.561103821s of 10.734168053s, submitted: 73
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555386 data_alloc: 234881024 data_used: 16371712
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555642 data_alloc: 234881024 data_used: 16371712
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.766269684s of 10.206396103s, submitted: 5
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0540c00 session 0x5616e312c3c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327cc00 session 0x5616e34734a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321b000 session 0x5616e0cdb680
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5000 session 0x5616e04dde00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327a400 session 0x5616e1f04780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1439208 data_alloc: 234881024 data_used: 15007744
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618c00 session 0x5616e311b860
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: mgrc ms_handle_reset ms_handle_reset con 0x5616dff2cc00
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1282799344
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1282799344,v1:192.168.122.100:6801/1282799344]
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: mgrc handle_mgr_configure stats_period=5
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d7c00 session 0x5616e312cd20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226a400 session 0x5616e1f04d20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618c00 session 0x5616dfb70b40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327a400 session 0x5616e1f7a780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.408138275s of 34.488780975s, submitted: 26
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5000 session 0x5616e34eba40
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325281 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325281 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 24403968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f400 session 0x5616e312c780
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121454592 unmapped: 24395776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337509 data_alloc: 234881024 data_used: 11628544
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337509 data_alloc: 234881024 data_used: 11628544
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.755592346s of 17.786962509s, submitted: 13
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,0,0,1])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 20889600 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1441767 data_alloc: 234881024 data_used: 13156352
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a1a000/0x0/0x4ffc00000, data 0x2786ac2/0x2841000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1434471 data_alloc: 234881024 data_used: 13156352
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a18000/0x0/0x4ffc00000, data 0x2789ac2/0x2844000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5800 session 0x5616e312c3c0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c400 session 0x5616e312cd20
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.164107323s of 11.421627998s, submitted: 102
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 23257088 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5800 session 0x5616e359a5a0
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}'
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: do_command 'config show' '{prefix=config show}'
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}'
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}'
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 24305664 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:14:24 np0005604791 ceph-osd[77691]: do_command 'log dump' '{prefix=log dump}'
Feb  2 05:14:24 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb  2 05:14:24 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3594690318' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb  2 05:14:24 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 05:14:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb  2 05:14:25 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/408226540' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb  2 05:14:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb  2 05:14:25 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/223053324' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb  2 05:14:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:25.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:25.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Feb  2 05:14:26 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3137409554' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Feb  2 05:14:26 np0005604791 nova_compute[226294]: 2026-02-02 10:14:26.801 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:14:26 np0005604791 nova_compute[226294]: 2026-02-02 10:14:26.803 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:14:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Feb  2 05:14:27 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4278920637' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb  2 05:14:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Feb  2 05:14:27 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1157535423' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb  2 05:14:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:27.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:27.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Feb  2 05:14:27 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4104614483' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Feb  2 05:14:28 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Feb  2 05:14:28 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2970138867' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Feb  2 05:14:28 np0005604791 systemd[1]: Starting Hostname Service...
Feb  2 05:14:28 np0005604791 systemd[1]: Started Hostname Service.
Feb  2 05:14:28 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Feb  2 05:14:28 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3330418646' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Feb  2 05:14:28 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Feb  2 05:14:28 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/482955478' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Feb  2 05:14:28 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Feb  2 05:14:28 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1038496448' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Feb  2 05:14:28 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Feb  2 05:14:28 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3279308775' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Feb  2 05:14:29 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Feb  2 05:14:29 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1846634575' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Feb  2 05:14:29 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Feb  2 05:14:29 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3696275265' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Feb  2 05:14:29 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Feb  2 05:14:29 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2233140680' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Feb  2 05:14:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:14:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:29.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:14:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:14:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:29.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:29 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb  2 05:14:29 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2314626929' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb  2 05:14:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Feb  2 05:14:30 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2142204933' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Feb  2 05:14:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Feb  2 05:14:30 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3681611476' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Feb  2 05:14:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Feb  2 05:14:30 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/757916292' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Feb  2 05:14:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:14:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:31.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:14:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:14:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:31.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:14:31 np0005604791 nova_compute[226294]: 2026-02-02 10:14:31.804 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:14:31 np0005604791 nova_compute[226294]: 2026-02-02 10:14:31.806 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:14:31 np0005604791 nova_compute[226294]: 2026-02-02 10:14:31.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:14:31 np0005604791 nova_compute[226294]: 2026-02-02 10:14:31.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:14:31 np0005604791 nova_compute[226294]: 2026-02-02 10:14:31.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:14:31 np0005604791 nova_compute[226294]: 2026-02-02 10:14:31.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:14:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Feb  2 05:14:32 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2796014380' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Feb  2 05:14:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Feb  2 05:14:32 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3216905718' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb  2 05:14:33 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Feb  2 05:14:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1519222063' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb  2 05:14:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb  2 05:14:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb  2 05:14:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:33.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:33.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb  2 05:14:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb  2 05:14:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb  2 05:14:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb  2 05:14:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb  2 05:14:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb  2 05:14:34 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Feb  2 05:14:34 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3425981063' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb  2 05:14:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Feb  2 05:14:35 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/499973376' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Feb  2 05:14:35 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb  2 05:14:35 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb  2 05:14:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Feb  2 05:14:35 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3657339990' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Feb  2 05:14:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:35.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:35.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:35 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb  2 05:14:35 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb  2 05:14:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Feb  2 05:14:35 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/923413145' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Feb  2 05:14:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Feb  2 05:14:36 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/610912574' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Feb  2 05:14:36 np0005604791 nova_compute[226294]: 2026-02-02 10:14:36.805 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:14:36 np0005604791 nova_compute[226294]: 2026-02-02 10:14:36.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:14:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Feb  2 05:14:37 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/891695340' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Feb  2 05:14:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:14:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:37.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:37.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:38 np0005604791 ovs-appctl[241966]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb  2 05:14:38 np0005604791 ovs-appctl[241972]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb  2 05:14:38 np0005604791 ovs-appctl[241982]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb  2 05:14:39 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Feb  2 05:14:39 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/996126505' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Feb  2 05:14:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:39.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:14:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:39.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:39 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Feb  2 05:14:39 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1974253353' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Feb  2 05:14:40 np0005604791 podman[242880]: 2026-02-02 10:14:40.389120722 +0000 UTC m=+0.066911557 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb  2 05:14:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Feb  2 05:14:41 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4142895348' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Feb  2 05:14:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:41.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:41.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:41 np0005604791 nova_compute[226294]: 2026-02-02 10:14:41.806 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:14:41 np0005604791 nova_compute[226294]: 2026-02-02 10:14:41.809 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:14:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Feb  2 05:14:41 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1782890941' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Feb  2 05:14:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:43 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Feb  2 05:14:43 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2681020612' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb  2 05:14:43 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Feb  2 05:14:43 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2649633197' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Feb  2 05:14:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:14:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:43.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:43.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:43 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Feb  2 05:14:43 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/342004016' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Feb  2 05:14:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:14:44.914 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:14:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:14:44.914 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:14:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:14:44.914 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:14:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Feb  2 05:14:45 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/472295889' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb  2 05:14:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Feb  2 05:14:45 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3468868603' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Feb  2 05:14:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:45.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:45.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Feb  2 05:14:45 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3233536469' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Feb  2 05:14:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Feb  2 05:14:46 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1057507925' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Feb  2 05:14:46 np0005604791 nova_compute[226294]: 2026-02-02 10:14:46.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:14:46 np0005604791 nova_compute[226294]: 2026-02-02 10:14:46.811 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:14:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Feb  2 05:14:47 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/603170672' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Feb  2 05:14:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Feb  2 05:14:47 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2864374764' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Feb  2 05:14:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:47.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:47.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:48 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Feb  2 05:14:48 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3240901568' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Feb  2 05:14:48 np0005604791 podman[243897]: 2026-02-02 10:14:48.459874682 +0000 UTC m=+0.048589707 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb  2 05:14:48 np0005604791 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb  2 05:14:49 np0005604791 systemd[1]: Starting Time & Date Service...
Feb  2 05:14:49 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Feb  2 05:14:49 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2824407657' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Feb  2 05:14:49 np0005604791 systemd[1]: Started Time & Date Service.
Feb  2 05:14:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:49.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:49.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Feb  2 05:14:50 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1527859638' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb  2 05:14:51 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Feb  2 05:14:51 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/218288169' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Feb  2 05:14:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:51.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:51.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:51 np0005604791 nova_compute[226294]: 2026-02-02 10:14:51.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:14:51 np0005604791 nova_compute[226294]: 2026-02-02 10:14:51.811 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:14:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Feb  2 05:14:52 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3301089356' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Feb  2 05:14:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Feb  2 05:14:52 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2848879131' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Feb  2 05:14:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:53.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:14:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:53.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:55.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:55.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:56 np0005604791 nova_compute[226294]: 2026-02-02 10:14:56.811 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:14:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:14:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:14:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:57.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:14:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:57.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:14:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:14:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:59.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:14:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:14:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:14:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:59.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:01.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:01.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:01 np0005604791 nova_compute[226294]: 2026-02-02 10:15:01.812 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:15:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:03.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:03.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:05.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:05.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:06 np0005604791 nova_compute[226294]: 2026-02-02 10:15:06.814 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:15:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:07.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:07.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:09.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:09.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:11 np0005604791 podman[244631]: 2026-02-02 10:15:11.447948966 +0000 UTC m=+0.125427379 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb  2 05:15:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:11.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:11.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:11 np0005604791 nova_compute[226294]: 2026-02-02 10:15:11.816 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:13.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:15:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:14.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:14 np0005604791 nova_compute[226294]: 2026-02-02 10:15:14.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:15:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:15.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:16.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:16 np0005604791 nova_compute[226294]: 2026-02-02 10:15:16.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:15:16 np0005604791 nova_compute[226294]: 2026-02-02 10:15:16.651 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:15:16 np0005604791 nova_compute[226294]: 2026-02-02 10:15:16.651 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:15:16 np0005604791 nova_compute[226294]: 2026-02-02 10:15:16.676 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:15:16 np0005604791 nova_compute[226294]: 2026-02-02 10:15:16.676 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:15:16 np0005604791 nova_compute[226294]: 2026-02-02 10:15:16.677 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:15:16 np0005604791 nova_compute[226294]: 2026-02-02 10:15:16.816 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:16 np0005604791 nova_compute[226294]: 2026-02-02 10:15:16.818 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:15:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:15:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:15:17 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:15:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:15:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:17.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:18.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:18 np0005604791 nova_compute[226294]: 2026-02-02 10:15:18.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:15:19 np0005604791 podman[244748]: 2026-02-02 10:15:19.417171886 +0000 UTC m=+0.081618620 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb  2 05:15:19 np0005604791 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb  2 05:15:19 np0005604791 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb  2 05:15:19 np0005604791 nova_compute[226294]: 2026-02-02 10:15:19.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:15:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:15:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:19.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:20.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:21 np0005604791 nova_compute[226294]: 2026-02-02 10:15:21.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:15:21 np0005604791 nova_compute[226294]: 2026-02-02 10:15:21.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:15:21 np0005604791 nova_compute[226294]: 2026-02-02 10:15:21.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:15:21 np0005604791 nova_compute[226294]: 2026-02-02 10:15:21.682 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:15:21 np0005604791 nova_compute[226294]: 2026-02-02 10:15:21.683 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:15:21 np0005604791 nova_compute[226294]: 2026-02-02 10:15:21.683 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:15:21 np0005604791 nova_compute[226294]: 2026-02-02 10:15:21.684 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:15:21 np0005604791 nova_compute[226294]: 2026-02-02 10:15:21.684 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:15:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:15:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:21.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:21 np0005604791 nova_compute[226294]: 2026-02-02 10:15:21.818 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:22.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:15:22 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/958922429' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.135 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.289 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.290 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4686MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.290 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.291 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.361 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.362 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.389 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:15:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:22 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:15:22 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:15:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:15:22 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/7819906' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.863 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.867 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.882 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.883 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:15:22 np0005604791 nova_compute[226294]: 2026-02-02 10:15:22.884 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:15:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:23.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:23 np0005604791 nova_compute[226294]: 2026-02-02 10:15:23.884 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:15:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:24.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:25.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:26.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:26 np0005604791 nova_compute[226294]: 2026-02-02 10:15:26.821 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:15:26 np0005604791 nova_compute[226294]: 2026-02-02 10:15:26.822 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:26 np0005604791 nova_compute[226294]: 2026-02-02 10:15:26.822 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:15:26 np0005604791 nova_compute[226294]: 2026-02-02 10:15:26.822 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:15:26 np0005604791 nova_compute[226294]: 2026-02-02 10:15:26.823 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:15:26 np0005604791 nova_compute[226294]: 2026-02-02 10:15:26.824 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:27.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:28.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:29.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:15:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:30.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:31.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:31 np0005604791 nova_compute[226294]: 2026-02-02 10:15:31.824 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:15:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:32.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:33.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:34.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:35.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:36.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:36 np0005604791 nova_compute[226294]: 2026-02-02 10:15:36.827 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:15:36 np0005604791 nova_compute[226294]: 2026-02-02 10:15:36.828 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:15:36 np0005604791 nova_compute[226294]: 2026-02-02 10:15:36.828 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:15:36 np0005604791 nova_compute[226294]: 2026-02-02 10:15:36.829 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:15:36 np0005604791 nova_compute[226294]: 2026-02-02 10:15:36.878 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:36 np0005604791 nova_compute[226294]: 2026-02-02 10:15:36.879 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:15:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:37.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:38.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:39 np0005604791 systemd-logind[805]: Session 55 logged out. Waiting for processes to exit.
Feb  2 05:15:39 np0005604791 systemd[1]: session-55.scope: Deactivated successfully.
Feb  2 05:15:39 np0005604791 systemd[1]: session-55.scope: Consumed 2min 36.039s CPU time, 765.6M memory peak, read 304.5M from disk, written 211.4M to disk.
Feb  2 05:15:39 np0005604791 systemd-logind[805]: Removed session 55.
Feb  2 05:15:39 np0005604791 systemd-logind[805]: New session 56 of user zuul.
Feb  2 05:15:39 np0005604791 systemd[1]: Started Session 56 of User zuul.
Feb  2 05:15:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:15:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:39.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:39 np0005604791 systemd[1]: session-56.scope: Deactivated successfully.
Feb  2 05:15:39 np0005604791 systemd-logind[805]: Session 56 logged out. Waiting for processes to exit.
Feb  2 05:15:39 np0005604791 systemd-logind[805]: Removed session 56.
Feb  2 05:15:39 np0005604791 systemd-logind[805]: New session 57 of user zuul.
Feb  2 05:15:40 np0005604791 systemd[1]: Started Session 57 of User zuul.
Feb  2 05:15:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:40.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:40 np0005604791 systemd[1]: session-57.scope: Deactivated successfully.
Feb  2 05:15:40 np0005604791 systemd-logind[805]: Session 57 logged out. Waiting for processes to exit.
Feb  2 05:15:40 np0005604791 systemd-logind[805]: Removed session 57.
Feb  2 05:15:41 np0005604791 podman[244961]: 2026-02-02 10:15:41.706040898 +0000 UTC m=+0.087514487 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb  2 05:15:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:41.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:41 np0005604791 nova_compute[226294]: 2026-02-02 10:15:41.880 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:15:41 np0005604791 nova_compute[226294]: 2026-02-02 10:15:41.882 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:15:41 np0005604791 nova_compute[226294]: 2026-02-02 10:15:41.882 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:15:41 np0005604791 nova_compute[226294]: 2026-02-02 10:15:41.882 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:15:41 np0005604791 nova_compute[226294]: 2026-02-02 10:15:41.936 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:41 np0005604791 nova_compute[226294]: 2026-02-02 10:15:41.937 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:15:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:42.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:43.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:44.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:15:44.915 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:15:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:15:44.916 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:15:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:15:44.916 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:15:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:15:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:45.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:15:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:46.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:46 np0005604791 nova_compute[226294]: 2026-02-02 10:15:46.938 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:15:46 np0005604791 nova_compute[226294]: 2026-02-02 10:15:46.940 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:15:46 np0005604791 nova_compute[226294]: 2026-02-02 10:15:46.941 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:15:46 np0005604791 nova_compute[226294]: 2026-02-02 10:15:46.941 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:15:46 np0005604791 nova_compute[226294]: 2026-02-02 10:15:46.951 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:46 np0005604791 nova_compute[226294]: 2026-02-02 10:15:46.953 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:15:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:47.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:48.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:49.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:15:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:50.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:50 np0005604791 podman[244991]: 2026-02-02 10:15:50.395131681 +0000 UTC m=+0.064849083 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb  2 05:15:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:51.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:51 np0005604791 nova_compute[226294]: 2026-02-02 10:15:51.953 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:51 np0005604791 nova_compute[226294]: 2026-02-02 10:15:51.956 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:15:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:52.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.797206) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352797248, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2724, "num_deletes": 251, "total_data_size": 6598238, "memory_usage": 6685136, "flush_reason": "Manual Compaction"}
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352846441, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4262073, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31255, "largest_seqno": 33974, "table_properties": {"data_size": 4250121, "index_size": 7486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 29962, "raw_average_key_size": 22, "raw_value_size": 4224550, "raw_average_value_size": 3138, "num_data_blocks": 319, "num_entries": 1346, "num_filter_entries": 1346, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027157, "oldest_key_time": 1770027157, "file_creation_time": 1770027352, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 49278 microseconds, and 10765 cpu microseconds.
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.846487) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4262073 bytes OK
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.846503) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.850197) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.850246) EVENT_LOG_v1 {"time_micros": 1770027352850236, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.850271) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6585171, prev total WAL file size 6585171, number of live WAL files 2.
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.851236) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4162KB)], [60(11MB)]
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352851273, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 15907516, "oldest_snapshot_seqno": -1}
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6611 keys, 13905949 bytes, temperature: kUnknown
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352977006, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 13905949, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13862827, "index_size": 25474, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16581, "raw_key_size": 169843, "raw_average_key_size": 25, "raw_value_size": 13745104, "raw_average_value_size": 2079, "num_data_blocks": 1023, "num_entries": 6611, "num_filter_entries": 6611, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027352, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.977328) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 13905949 bytes
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.984192) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.4 rd, 110.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.1, 11.1 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(7.0) write-amplify(3.3) OK, records in: 7132, records dropped: 521 output_compression: NoCompression
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.984227) EVENT_LOG_v1 {"time_micros": 1770027352984212, "job": 36, "event": "compaction_finished", "compaction_time_micros": 125820, "compaction_time_cpu_micros": 19360, "output_level": 6, "num_output_files": 1, "total_output_size": 13905949, "num_input_records": 7132, "num_output_records": 6611, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352985083, "job": 36, "event": "table_file_deletion", "file_number": 62}
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352987114, "job": 36, "event": "table_file_deletion", "file_number": 60}
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.851095) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.987368) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.987379) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.987383) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.987386) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:15:52 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.987389) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:15:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:53.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:54.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:55.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:56.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:56 np0005604791 nova_compute[226294]: 2026-02-02 10:15:56.958 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:15:56 np0005604791 nova_compute[226294]: 2026-02-02 10:15:56.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:15:56 np0005604791 nova_compute[226294]: 2026-02-02 10:15:56.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:15:56 np0005604791 nova_compute[226294]: 2026-02-02 10:15:56.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:15:56 np0005604791 nova_compute[226294]: 2026-02-02 10:15:56.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:15:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:15:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:57.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:15:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:15:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:58.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:15:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:15:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:15:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:59.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:00.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:01.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:01 np0005604791 nova_compute[226294]: 2026-02-02 10:16:01.960 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:02.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:03.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:04.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:05.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:06.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:06 np0005604791 nova_compute[226294]: 2026-02-02 10:16:06.967 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:07.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:08.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:09.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:10.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:16:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:11.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:16:11 np0005604791 nova_compute[226294]: 2026-02-02 10:16:11.966 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:11 np0005604791 nova_compute[226294]: 2026-02-02 10:16:11.969 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:12.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:12 np0005604791 podman[245046]: 2026-02-02 10:16:12.409922515 +0000 UTC m=+0.082931555 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:16:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:13.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:14.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:15 np0005604791 nova_compute[226294]: 2026-02-02 10:16:15.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:16:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:15.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:16.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:16 np0005604791 nova_compute[226294]: 2026-02-02 10:16:16.651 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:16:16 np0005604791 nova_compute[226294]: 2026-02-02 10:16:16.991 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:16 np0005604791 nova_compute[226294]: 2026-02-02 10:16:16.992 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:16 np0005604791 nova_compute[226294]: 2026-02-02 10:16:16.993 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:16:16 np0005604791 nova_compute[226294]: 2026-02-02 10:16:16.993 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:16 np0005604791 nova_compute[226294]: 2026-02-02 10:16:16.993 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:16 np0005604791 nova_compute[226294]: 2026-02-02 10:16:16.995 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:17 np0005604791 nova_compute[226294]: 2026-02-02 10:16:17.645 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:16:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:17.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:18.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:18 np0005604791 nova_compute[226294]: 2026-02-02 10:16:18.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:16:18 np0005604791 nova_compute[226294]: 2026-02-02 10:16:18.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:16:18 np0005604791 nova_compute[226294]: 2026-02-02 10:16:18.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:16:18 np0005604791 nova_compute[226294]: 2026-02-02 10:16:18.681 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:16:18 np0005604791 nova_compute[226294]: 2026-02-02 10:16:18.681 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:16:19 np0005604791 nova_compute[226294]: 2026-02-02 10:16:19.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:16:19 np0005604791 nova_compute[226294]: 2026-02-02 10:16:19.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:16:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:19.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:20.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:21 np0005604791 podman[245078]: 2026-02-02 10:16:21.400395155 +0000 UTC m=+0.081523047 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb  2 05:16:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:21.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:21 np0005604791 nova_compute[226294]: 2026-02-02 10:16:21.996 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:21 np0005604791 nova_compute[226294]: 2026-02-02 10:16:21.998 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:21 np0005604791 nova_compute[226294]: 2026-02-02 10:16:21.998 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:16:21 np0005604791 nova_compute[226294]: 2026-02-02 10:16:21.998 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:22 np0005604791 nova_compute[226294]: 2026-02-02 10:16:22.033 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:22 np0005604791 nova_compute[226294]: 2026-02-02 10:16:22.034 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:22.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:22 np0005604791 nova_compute[226294]: 2026-02-02 10:16:22.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:16:22 np0005604791 nova_compute[226294]: 2026-02-02 10:16:22.676 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:16:22 np0005604791 nova_compute[226294]: 2026-02-02 10:16:22.677 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:16:22 np0005604791 nova_compute[226294]: 2026-02-02 10:16:22.677 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:16:22 np0005604791 nova_compute[226294]: 2026-02-02 10:16:22.677 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:16:22 np0005604791 nova_compute[226294]: 2026-02-02 10:16:22.678 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:16:23 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:16:23 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/613712461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.124 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.305 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.307 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4858MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.307 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.307 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.396 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.397 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.417 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:16:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:23.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:23 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:16:23 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/216113752' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.896 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.904 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.931 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.933 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:16:23 np0005604791 nova_compute[226294]: 2026-02-02 10:16:23.933 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:16:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:24.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:24 np0005604791 nova_compute[226294]: 2026-02-02 10:16:24.935 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:16:24 np0005604791 nova_compute[226294]: 2026-02-02 10:16:24.936 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:16:24 np0005604791 nova_compute[226294]: 2026-02-02 10:16:24.936 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:16:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:25.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:26.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:16:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:16:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:16:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:16:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:16:26 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:16:27 np0005604791 nova_compute[226294]: 2026-02-02 10:16:27.035 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:27 np0005604791 nova_compute[226294]: 2026-02-02 10:16:27.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:27 np0005604791 nova_compute[226294]: 2026-02-02 10:16:27.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:16:27 np0005604791 nova_compute[226294]: 2026-02-02 10:16:27.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:27 np0005604791 nova_compute[226294]: 2026-02-02 10:16:27.077 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:27 np0005604791 nova_compute[226294]: 2026-02-02 10:16:27.078 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:16:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:27.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:16:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:28.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:29.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:30.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:31 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:16:31 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:16:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:31.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:32 np0005604791 nova_compute[226294]: 2026-02-02 10:16:32.078 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:32 np0005604791 nova_compute[226294]: 2026-02-02 10:16:32.080 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:32.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:33.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:34.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:35.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:36.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:37 np0005604791 nova_compute[226294]: 2026-02-02 10:16:37.081 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:37 np0005604791 nova_compute[226294]: 2026-02-02 10:16:37.083 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:37 np0005604791 nova_compute[226294]: 2026-02-02 10:16:37.083 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:16:37 np0005604791 nova_compute[226294]: 2026-02-02 10:16:37.083 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:37 np0005604791 nova_compute[226294]: 2026-02-02 10:16:37.120 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:37 np0005604791 nova_compute[226294]: 2026-02-02 10:16:37.120 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:37.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:16:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:38.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:16:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:39.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:40.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:41.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:42 np0005604791 nova_compute[226294]: 2026-02-02 10:16:42.121 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:42 np0005604791 nova_compute[226294]: 2026-02-02 10:16:42.173 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:42 np0005604791 nova_compute[226294]: 2026-02-02 10:16:42.173 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5053 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:16:42 np0005604791 nova_compute[226294]: 2026-02-02 10:16:42.173 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:42 np0005604791 nova_compute[226294]: 2026-02-02 10:16:42.174 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:42 np0005604791 nova_compute[226294]: 2026-02-02 10:16:42.175 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:42.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:43 np0005604791 podman[245309]: 2026-02-02 10:16:43.419556217 +0000 UTC m=+0.097810962 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb  2 05:16:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:43.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:16:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:44.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:16:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:16:44.917 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:16:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:16:44.917 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:16:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:16:44.917 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:16:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:45.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:46.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:47 np0005604791 nova_compute[226294]: 2026-02-02 10:16:47.175 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:47.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:48.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:49.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:50.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:16:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:51.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:16:52 np0005604791 nova_compute[226294]: 2026-02-02 10:16:52.178 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:52 np0005604791 nova_compute[226294]: 2026-02-02 10:16:52.179 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:52 np0005604791 nova_compute[226294]: 2026-02-02 10:16:52.179 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:16:52 np0005604791 nova_compute[226294]: 2026-02-02 10:16:52.179 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:52.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:52 np0005604791 nova_compute[226294]: 2026-02-02 10:16:52.212 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:52 np0005604791 nova_compute[226294]: 2026-02-02 10:16:52.213 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:52 np0005604791 podman[245339]: 2026-02-02 10:16:52.385393418 +0000 UTC m=+0.062600542 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb  2 05:16:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:16:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:53.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:16:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:16:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:54.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:16:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb  2 05:16:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2626537737' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb  2 05:16:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb  2 05:16:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2626537737' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb  2 05:16:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:55.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:56.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:57 np0005604791 nova_compute[226294]: 2026-02-02 10:16:57.214 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:57 np0005604791 nova_compute[226294]: 2026-02-02 10:16:57.216 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:16:57 np0005604791 nova_compute[226294]: 2026-02-02 10:16:57.216 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:16:57 np0005604791 nova_compute[226294]: 2026-02-02 10:16:57.216 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:57 np0005604791 nova_compute[226294]: 2026-02-02 10:16:57.264 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:16:57 np0005604791 nova_compute[226294]: 2026-02-02 10:16:57.265 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:16:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:16:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:57.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:58.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:16:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:16:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:16:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:59.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:00.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:01.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:02.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:02 np0005604791 nova_compute[226294]: 2026-02-02 10:17:02.267 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:02 np0005604791 nova_compute[226294]: 2026-02-02 10:17:02.269 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:02 np0005604791 nova_compute[226294]: 2026-02-02 10:17:02.269 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:17:02 np0005604791 nova_compute[226294]: 2026-02-02 10:17:02.269 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:02 np0005604791 nova_compute[226294]: 2026-02-02 10:17:02.304 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:02 np0005604791 nova_compute[226294]: 2026-02-02 10:17:02.304 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:03.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:04.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:05.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:06.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:07 np0005604791 nova_compute[226294]: 2026-02-02 10:17:07.305 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:07 np0005604791 nova_compute[226294]: 2026-02-02 10:17:07.307 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:07 np0005604791 nova_compute[226294]: 2026-02-02 10:17:07.307 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:17:07 np0005604791 nova_compute[226294]: 2026-02-02 10:17:07.307 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:07 np0005604791 nova_compute[226294]: 2026-02-02 10:17:07.335 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:07 np0005604791 nova_compute[226294]: 2026-02-02 10:17:07.335 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:07.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:08.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:09.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:10.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:11.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:12.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:12 np0005604791 nova_compute[226294]: 2026-02-02 10:17:12.336 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:12 np0005604791 nova_compute[226294]: 2026-02-02 10:17:12.338 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:12 np0005604791 nova_compute[226294]: 2026-02-02 10:17:12.338 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:17:12 np0005604791 nova_compute[226294]: 2026-02-02 10:17:12.338 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:12 np0005604791 nova_compute[226294]: 2026-02-02 10:17:12.372 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:12 np0005604791 nova_compute[226294]: 2026-02-02 10:17:12.372 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:17:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:13.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:17:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:14.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:14 np0005604791 podman[245395]: 2026-02-02 10:17:14.444194848 +0000 UTC m=+0.118240397 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb  2 05:17:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:15.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:16.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:16 np0005604791 nova_compute[226294]: 2026-02-02 10:17:16.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:17:17 np0005604791 nova_compute[226294]: 2026-02-02 10:17:17.373 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:17 np0005604791 nova_compute[226294]: 2026-02-02 10:17:17.379 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:17 np0005604791 nova_compute[226294]: 2026-02-02 10:17:17.379 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:17:17 np0005604791 nova_compute[226294]: 2026-02-02 10:17:17.379 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:17 np0005604791 nova_compute[226294]: 2026-02-02 10:17:17.429 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:17 np0005604791 nova_compute[226294]: 2026-02-02 10:17:17.429 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:17 np0005604791 nova_compute[226294]: 2026-02-02 10:17:17.430 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:17.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:18.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:18 np0005604791 nova_compute[226294]: 2026-02-02 10:17:18.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:17:18 np0005604791 nova_compute[226294]: 2026-02-02 10:17:18.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:17:18 np0005604791 nova_compute[226294]: 2026-02-02 10:17:18.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:17:18 np0005604791 nova_compute[226294]: 2026-02-02 10:17:18.666 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:17:18 np0005604791 nova_compute[226294]: 2026-02-02 10:17:18.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:17:18 np0005604791 nova_compute[226294]: 2026-02-02 10:17:18.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:17:19 np0005604791 nova_compute[226294]: 2026-02-02 10:17:19.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:17:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:17:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:19.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:17:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:20.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:21 np0005604791 nova_compute[226294]: 2026-02-02 10:17:21.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:17:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:21.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:22.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:22 np0005604791 nova_compute[226294]: 2026-02-02 10:17:22.431 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:22 np0005604791 nova_compute[226294]: 2026-02-02 10:17:22.433 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:22 np0005604791 nova_compute[226294]: 2026-02-02 10:17:22.433 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:17:22 np0005604791 nova_compute[226294]: 2026-02-02 10:17:22.433 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:22 np0005604791 nova_compute[226294]: 2026-02-02 10:17:22.475 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:22 np0005604791 nova_compute[226294]: 2026-02-02 10:17:22.475 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:23 np0005604791 podman[245453]: 2026-02-02 10:17:23.36066748 +0000 UTC m=+0.041812267 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb  2 05:17:23 np0005604791 nova_compute[226294]: 2026-02-02 10:17:23.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:17:23 np0005604791 nova_compute[226294]: 2026-02-02 10:17:23.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:17:23 np0005604791 nova_compute[226294]: 2026-02-02 10:17:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:17:23 np0005604791 nova_compute[226294]: 2026-02-02 10:17:23.679 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:17:23 np0005604791 nova_compute[226294]: 2026-02-02 10:17:23.679 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:17:23 np0005604791 nova_compute[226294]: 2026-02-02 10:17:23.680 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:17:23 np0005604791 nova_compute[226294]: 2026-02-02 10:17:23.680 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:17:23 np0005604791 nova_compute[226294]: 2026-02-02 10:17:23.681 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:17:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:23.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:24 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:17:24 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/557594777' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.130 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:17:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:24.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.272 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.274 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4874MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.274 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.275 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.334 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.335 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.355 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:17:24 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:17:24 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3229599435' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.804 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.810 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.830 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.833 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:17:24 np0005604791 nova_compute[226294]: 2026-02-02 10:17:24.834 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:17:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:25.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:26.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:26 np0005604791 nova_compute[226294]: 2026-02-02 10:17:26.836 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:17:27 np0005604791 nova_compute[226294]: 2026-02-02 10:17:27.476 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:27 np0005604791 nova_compute[226294]: 2026-02-02 10:17:27.478 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:27 np0005604791 nova_compute[226294]: 2026-02-02 10:17:27.479 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:17:27 np0005604791 nova_compute[226294]: 2026-02-02 10:17:27.479 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:27 np0005604791 nova_compute[226294]: 2026-02-02 10:17:27.515 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:27 np0005604791 nova_compute[226294]: 2026-02-02 10:17:27.515 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:27.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:17:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:28.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:17:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:29.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:30.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:17:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:31.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:17:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:32.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:32 np0005604791 nova_compute[226294]: 2026-02-02 10:17:32.516 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:32 np0005604791 nova_compute[226294]: 2026-02-02 10:17:32.518 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:32 np0005604791 nova_compute[226294]: 2026-02-02 10:17:32.518 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:17:32 np0005604791 nova_compute[226294]: 2026-02-02 10:17:32.518 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:32 np0005604791 nova_compute[226294]: 2026-02-02 10:17:32.560 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:32 np0005604791 nova_compute[226294]: 2026-02-02 10:17:32.561 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:17:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:17:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:17:32 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:17:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:33.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:34.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:35.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:36.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:17:37 np0005604791 nova_compute[226294]: 2026-02-02 10:17:37.563 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:37 np0005604791 nova_compute[226294]: 2026-02-02 10:17:37.564 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:37 np0005604791 nova_compute[226294]: 2026-02-02 10:17:37.564 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:17:37 np0005604791 nova_compute[226294]: 2026-02-02 10:17:37.564 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:37 np0005604791 nova_compute[226294]: 2026-02-02 10:17:37.614 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:37 np0005604791 nova_compute[226294]: 2026-02-02 10:17:37.615 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.646342) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457646388, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1565, "num_deletes": 505, "total_data_size": 3212587, "memory_usage": 3274512, "flush_reason": "Manual Compaction"}
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457662079, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 2072083, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33979, "largest_seqno": 35539, "table_properties": {"data_size": 2065840, "index_size": 2998, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 15397, "raw_average_key_size": 17, "raw_value_size": 2051204, "raw_average_value_size": 2357, "num_data_blocks": 132, "num_entries": 870, "num_filter_entries": 870, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027353, "oldest_key_time": 1770027353, "file_creation_time": 1770027457, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 15844 microseconds, and 6053 cpu microseconds.
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.662141) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 2072083 bytes OK
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.662209) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.664047) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.664073) EVENT_LOG_v1 {"time_micros": 1770027457664065, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.664099) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 3204307, prev total WAL file size 3204307, number of live WAL files 2.
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.665214) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323534' seq:72057594037927935, type:22 .. '6B7600353035' seq:0, type:0; will stop at (end)
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(2023KB)], [63(13MB)]
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457665292, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 15978032, "oldest_snapshot_seqno": -1}
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6456 keys, 14497144 bytes, temperature: kUnknown
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457820205, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 14497144, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14453674, "index_size": 26208, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16197, "raw_key_size": 169525, "raw_average_key_size": 26, "raw_value_size": 14337089, "raw_average_value_size": 2220, "num_data_blocks": 1037, "num_entries": 6456, "num_filter_entries": 6456, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027457, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.820535) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 14497144 bytes
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.822262) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 103.1 rd, 93.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 13.3 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(14.7) write-amplify(7.0) OK, records in: 7481, records dropped: 1025 output_compression: NoCompression
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.822298) EVENT_LOG_v1 {"time_micros": 1770027457822282, "job": 38, "event": "compaction_finished", "compaction_time_micros": 155000, "compaction_time_cpu_micros": 40127, "output_level": 6, "num_output_files": 1, "total_output_size": 14497144, "num_input_records": 7481, "num_output_records": 6456, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457822775, "job": 38, "event": "table_file_deletion", "file_number": 65}
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457825079, "job": 38, "event": "table_file_deletion", "file_number": 63}
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.665046) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.825150) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.825158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.825186) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.825189) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:17:37 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.825191) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:17:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:17:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:37.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:17:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:38.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:39.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:40.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:41.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:42.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:42 np0005604791 nova_compute[226294]: 2026-02-02 10:17:42.616 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:42 np0005604791 nova_compute[226294]: 2026-02-02 10:17:42.618 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:42 np0005604791 nova_compute[226294]: 2026-02-02 10:17:42.618 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:17:42 np0005604791 nova_compute[226294]: 2026-02-02 10:17:42.618 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:42 np0005604791 nova_compute[226294]: 2026-02-02 10:17:42.649 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:42 np0005604791 nova_compute[226294]: 2026-02-02 10:17:42.650 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:43.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:44.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:17:44.918 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:17:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:17:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:17:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:17:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:17:45 np0005604791 podman[245659]: 2026-02-02 10:17:45.460818215 +0000 UTC m=+0.131302446 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb  2 05:17:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:45.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:46.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:47 np0005604791 nova_compute[226294]: 2026-02-02 10:17:47.651 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:47.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:48.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:17:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:49.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:17:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:50.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:51.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:52.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:52 np0005604791 nova_compute[226294]: 2026-02-02 10:17:52.653 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:52 np0005604791 nova_compute[226294]: 2026-02-02 10:17:52.654 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:17:52 np0005604791 nova_compute[226294]: 2026-02-02 10:17:52.654 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:17:52 np0005604791 nova_compute[226294]: 2026-02-02 10:17:52.655 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:52 np0005604791 nova_compute[226294]: 2026-02-02 10:17:52.694 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:52 np0005604791 nova_compute[226294]: 2026-02-02 10:17:52.695 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:17:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:17:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:53.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:17:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:54.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:54 np0005604791 podman[245691]: 2026-02-02 10:17:54.387037939 +0000 UTC m=+0.060651500 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:17:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb  2 05:17:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2534645283' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb  2 05:17:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb  2 05:17:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2534645283' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb  2 05:17:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:17:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:55.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:17:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:56.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:17:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:17:57 np0005604791 nova_compute[226294]: 2026-02-02 10:17:57.695 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:17:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:17:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:57.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:17:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:17:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:58.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:17:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:17:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:17:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:59.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000054s ======
Feb  2 05:18:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:00.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Feb  2 05:18:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:18:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:01.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:18:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:02.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:02 np0005604791 nova_compute[226294]: 2026-02-02 10:18:02.697 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:02 np0005604791 nova_compute[226294]: 2026-02-02 10:18:02.698 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:02 np0005604791 nova_compute[226294]: 2026-02-02 10:18:02.698 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:18:02 np0005604791 nova_compute[226294]: 2026-02-02 10:18:02.698 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:02 np0005604791 nova_compute[226294]: 2026-02-02 10:18:02.729 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:02 np0005604791 nova_compute[226294]: 2026-02-02 10:18:02.730 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:03.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:04.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:05.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:06.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:07 np0005604791 nova_compute[226294]: 2026-02-02 10:18:07.731 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:07.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:08.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:09.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:10.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:11.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:12.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:12 np0005604791 nova_compute[226294]: 2026-02-02 10:18:12.732 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:12 np0005604791 nova_compute[226294]: 2026-02-02 10:18:12.734 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:12 np0005604791 nova_compute[226294]: 2026-02-02 10:18:12.735 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:18:12 np0005604791 nova_compute[226294]: 2026-02-02 10:18:12.735 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:12 np0005604791 nova_compute[226294]: 2026-02-02 10:18:12.762 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:12 np0005604791 nova_compute[226294]: 2026-02-02 10:18:12.763 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:13.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:15.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:15.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:16 np0005604791 podman[245747]: 2026-02-02 10:18:16.476032142 +0000 UTC m=+0.135679743 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Feb  2 05:18:16 np0005604791 nova_compute[226294]: 2026-02-02 10:18:16.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:16 np0005604791 nova_compute[226294]: 2026-02-02 10:18:16.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Feb  2 05:18:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:17.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:17 np0005604791 nova_compute[226294]: 2026-02-02 10:18:17.764 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:17 np0005604791 nova_compute[226294]: 2026-02-02 10:18:17.765 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:17 np0005604791 nova_compute[226294]: 2026-02-02 10:18:17.766 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:18:17 np0005604791 nova_compute[226294]: 2026-02-02 10:18:17.766 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:17 np0005604791 nova_compute[226294]: 2026-02-02 10:18:17.766 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:17 np0005604791 nova_compute[226294]: 2026-02-02 10:18:17.769 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:17.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:18 np0005604791 nova_compute[226294]: 2026-02-02 10:18:18.666 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:18 np0005604791 nova_compute[226294]: 2026-02-02 10:18:18.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:19.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:19 np0005604791 nova_compute[226294]: 2026-02-02 10:18:19.645 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:19 np0005604791 nova_compute[226294]: 2026-02-02 10:18:19.672 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:19.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:20 np0005604791 nova_compute[226294]: 2026-02-02 10:18:20.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:20 np0005604791 nova_compute[226294]: 2026-02-02 10:18:20.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:18:20 np0005604791 nova_compute[226294]: 2026-02-02 10:18:20.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:18:20 np0005604791 nova_compute[226294]: 2026-02-02 10:18:20.663 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:18:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:21.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:21 np0005604791 nova_compute[226294]: 2026-02-02 10:18:21.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:21.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:22 np0005604791 nova_compute[226294]: 2026-02-02 10:18:22.645 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:22 np0005604791 nova_compute[226294]: 2026-02-02 10:18:22.765 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:22 np0005604791 nova_compute[226294]: 2026-02-02 10:18:22.770 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:23.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:23 np0005604791 nova_compute[226294]: 2026-02-02 10:18:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:23 np0005604791 nova_compute[226294]: 2026-02-02 10:18:23.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:18:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:24.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:24 np0005604791 nova_compute[226294]: 2026-02-02 10:18:24.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:24 np0005604791 nova_compute[226294]: 2026-02-02 10:18:24.689 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:18:24 np0005604791 nova_compute[226294]: 2026-02-02 10:18:24.689 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:18:24 np0005604791 nova_compute[226294]: 2026-02-02 10:18:24.690 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:18:24 np0005604791 nova_compute[226294]: 2026-02-02 10:18:24.690 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:18:24 np0005604791 nova_compute[226294]: 2026-02-02 10:18:24.690 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:18:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:18:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:25.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:18:25 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:18:25 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/356890111' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:18:25 np0005604791 nova_compute[226294]: 2026-02-02 10:18:25.168 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:18:25 np0005604791 nova_compute[226294]: 2026-02-02 10:18:25.352 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:18:25 np0005604791 nova_compute[226294]: 2026-02-02 10:18:25.354 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4901MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:18:25 np0005604791 nova_compute[226294]: 2026-02-02 10:18:25.354 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:18:25 np0005604791 nova_compute[226294]: 2026-02-02 10:18:25.355 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:18:25 np0005604791 podman[245826]: 2026-02-02 10:18:25.37912583 +0000 UTC m=+0.055079129 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:18:25 np0005604791 nova_compute[226294]: 2026-02-02 10:18:25.574 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:18:25 np0005604791 nova_compute[226294]: 2026-02-02 10:18:25.575 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:18:25 np0005604791 nova_compute[226294]: 2026-02-02 10:18:25.606 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:18:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:26.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:18:26 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1611172185' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:18:26 np0005604791 nova_compute[226294]: 2026-02-02 10:18:26.061 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:18:26 np0005604791 nova_compute[226294]: 2026-02-02 10:18:26.066 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:18:26 np0005604791 nova_compute[226294]: 2026-02-02 10:18:26.087 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:18:26 np0005604791 nova_compute[226294]: 2026-02-02 10:18:26.090 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:18:26 np0005604791 nova_compute[226294]: 2026-02-02 10:18:26.090 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:18:26 np0005604791 nova_compute[226294]: 2026-02-02 10:18:26.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:26 np0005604791 nova_compute[226294]: 2026-02-02 10:18:26.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:27.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:27 np0005604791 nova_compute[226294]: 2026-02-02 10:18:27.767 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:27 np0005604791 nova_compute[226294]: 2026-02-02 10:18:27.771 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:28.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:29.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:18:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:30.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:18:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:31.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:32.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:32 np0005604791 nova_compute[226294]: 2026-02-02 10:18:32.772 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:32 np0005604791 nova_compute[226294]: 2026-02-02 10:18:32.774 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:32 np0005604791 nova_compute[226294]: 2026-02-02 10:18:32.774 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:18:32 np0005604791 nova_compute[226294]: 2026-02-02 10:18:32.774 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:32 np0005604791 nova_compute[226294]: 2026-02-02 10:18:32.802 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:32 np0005604791 nova_compute[226294]: 2026-02-02 10:18:32.802 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:33.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:34.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:35.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:36.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:37.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:37 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:18:37 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:18:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:37 np0005604791 nova_compute[226294]: 2026-02-02 10:18:37.803 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:38.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:38 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:18:38 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:18:38 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:18:38 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:18:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:39.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:39 np0005604791 nova_compute[226294]: 2026-02-02 10:18:39.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:18:39 np0005604791 nova_compute[226294]: 2026-02-02 10:18:39.668 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Feb  2 05:18:39 np0005604791 nova_compute[226294]: 2026-02-02 10:18:39.704 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Feb  2 05:18:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:40.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:18:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:41.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:18:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:42.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:42 np0005604791 nova_compute[226294]: 2026-02-02 10:18:42.805 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:42 np0005604791 nova_compute[226294]: 2026-02-02 10:18:42.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:42 np0005604791 nova_compute[226294]: 2026-02-02 10:18:42.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:18:42 np0005604791 nova_compute[226294]: 2026-02-02 10:18:42.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:42 np0005604791 nova_compute[226294]: 2026-02-02 10:18:42.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:42 np0005604791 nova_compute[226294]: 2026-02-02 10:18:42.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:42 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:18:42 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:18:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:43.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:44.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:18:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:18:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:18:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:18:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:18:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:18:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:45.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:18:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:46.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:18:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:18:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:47.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:18:47 np0005604791 podman[246010]: 2026-02-02 10:18:47.434586136 +0000 UTC m=+0.107762053 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb  2 05:18:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:47 np0005604791 nova_compute[226294]: 2026-02-02 10:18:47.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:47 np0005604791 nova_compute[226294]: 2026-02-02 10:18:47.810 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:18:47 np0005604791 nova_compute[226294]: 2026-02-02 10:18:47.810 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:18:47 np0005604791 nova_compute[226294]: 2026-02-02 10:18:47.811 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:47 np0005604791 nova_compute[226294]: 2026-02-02 10:18:47.840 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:47 np0005604791 nova_compute[226294]: 2026-02-02 10:18:47.840 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:18:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:18:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:48.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:18:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:49.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:50.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:51.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:52.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:52 np0005604791 nova_compute[226294]: 2026-02-02 10:18:52.841 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:53.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:54.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb  2 05:18:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2702642017' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb  2 05:18:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb  2 05:18:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2702642017' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb  2 05:18:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:55.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:56.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:56 np0005604791 podman[246041]: 2026-02-02 10:18:56.437530136 +0000 UTC m=+0.099669740 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 05:18:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:18:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:57.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:18:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:18:57 np0005604791 nova_compute[226294]: 2026-02-02 10:18:57.842 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:18:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:58.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:18:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:18:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:18:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:59.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:19:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:00.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:19:00 np0005604791 nova_compute[226294]: 2026-02-02 10:19:00.442 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:19:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:01.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:02.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:02 np0005604791 nova_compute[226294]: 2026-02-02 10:19:02.843 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:19:02 np0005604791 nova_compute[226294]: 2026-02-02 10:19:02.846 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:02 np0005604791 nova_compute[226294]: 2026-02-02 10:19:02.846 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:19:02 np0005604791 nova_compute[226294]: 2026-02-02 10:19:02.846 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:19:02 np0005604791 nova_compute[226294]: 2026-02-02 10:19:02.847 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:19:02 np0005604791 nova_compute[226294]: 2026-02-02 10:19:02.848 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:19:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:03.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:19:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:04.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:05.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:19:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:06.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:19:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:07.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:07 np0005604791 nova_compute[226294]: 2026-02-02 10:19:07.847 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:08.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 05:19:08 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 12K writes, 47K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s#012Cumulative WAL: 12K writes, 3443 syncs, 3.64 writes per sync, written: 0.03 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1534 writes, 5165 keys, 1534 commit groups, 1.0 writes per commit group, ingest: 4.83 MB, 0.01 MB/s#012Interval WAL: 1534 writes, 631 syncs, 2.43 writes per sync, written: 0.00 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb  2 05:19:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:19:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:09.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:19:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:10.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:11.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:12.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:12 np0005604791 nova_compute[226294]: 2026-02-02 10:19:12.848 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:12 np0005604791 nova_compute[226294]: 2026-02-02 10:19:12.851 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:13.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:14.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:15.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:16.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:17.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:17 np0005604791 nova_compute[226294]: 2026-02-02 10:19:17.852 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:19:17 np0005604791 nova_compute[226294]: 2026-02-02 10:19:17.853 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:19:17 np0005604791 nova_compute[226294]: 2026-02-02 10:19:17.854 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:19:17 np0005604791 nova_compute[226294]: 2026-02-02 10:19:17.854 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:19:17 np0005604791 nova_compute[226294]: 2026-02-02 10:19:17.903 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:17 np0005604791 nova_compute[226294]: 2026-02-02 10:19:17.903 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:19:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:18.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:18 np0005604791 podman[246095]: 2026-02-02 10:19:18.414468434 +0000 UTC m=+0.089407288 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb  2 05:19:18 np0005604791 nova_compute[226294]: 2026-02-02 10:19:18.676 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:19:18 np0005604791 nova_compute[226294]: 2026-02-02 10:19:18.677 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:19:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:19.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:19 np0005604791 nova_compute[226294]: 2026-02-02 10:19:19.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:19:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:20.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:20 np0005604791 nova_compute[226294]: 2026-02-02 10:19:20.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:19:20 np0005604791 nova_compute[226294]: 2026-02-02 10:19:20.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:19:20 np0005604791 nova_compute[226294]: 2026-02-02 10:19:20.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:19:20 np0005604791 nova_compute[226294]: 2026-02-02 10:19:20.706 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:19:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:21.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:22.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:22 np0005604791 nova_compute[226294]: 2026-02-02 10:19:22.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:19:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:22 np0005604791 nova_compute[226294]: 2026-02-02 10:19:22.904 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:23.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:23 np0005604791 nova_compute[226294]: 2026-02-02 10:19:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:19:23 np0005604791 nova_compute[226294]: 2026-02-02 10:19:23.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:19:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:24.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:24 np0005604791 nova_compute[226294]: 2026-02-02 10:19:24.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:19:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:19:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:25.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:19:25 np0005604791 nova_compute[226294]: 2026-02-02 10:19:25.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:19:25 np0005604791 nova_compute[226294]: 2026-02-02 10:19:25.670 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:19:25 np0005604791 nova_compute[226294]: 2026-02-02 10:19:25.671 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:19:25 np0005604791 nova_compute[226294]: 2026-02-02 10:19:25.671 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:19:25 np0005604791 nova_compute[226294]: 2026-02-02 10:19:25.671 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:19:25 np0005604791 nova_compute[226294]: 2026-02-02 10:19:25.671 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:19:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:26.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2773930151' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.145 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.284 226298 DEBUG oslo_concurrency.processutils [None req-c44df769-14b9-4ff4-8b94-fd29c4457052 41d09654a7d04d60a23411cf80fe1f98 823d3e7e313a44e9a50531e3fef22a1b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.304 226298 DEBUG oslo_concurrency.processutils [None req-c44df769-14b9-4ff4-8b94-fd29c4457052 41d09654a7d04d60a23411cf80fe1f98 823d3e7e313a44e9a50531e3fef22a1b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.353 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.354 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4902MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.354 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.354 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.470 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.471 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.487 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.587 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.588 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.607 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.636 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Feb  2 05:19:26 np0005604791 nova_compute[226294]: 2026-02-02 10:19:26.654 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.943076) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027566943207, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1280, "num_deletes": 251, "total_data_size": 3159674, "memory_usage": 3210328, "flush_reason": "Manual Compaction"}
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027566964058, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 2057326, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35544, "largest_seqno": 36819, "table_properties": {"data_size": 2051765, "index_size": 2956, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11972, "raw_average_key_size": 19, "raw_value_size": 2040593, "raw_average_value_size": 3395, "num_data_blocks": 130, "num_entries": 601, "num_filter_entries": 601, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027458, "oldest_key_time": 1770027458, "file_creation_time": 1770027566, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 21075 microseconds, and 3790 cpu microseconds.
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.964129) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 2057326 bytes OK
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.964188) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.967189) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.967207) EVENT_LOG_v1 {"time_micros": 1770027566967202, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.967228) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 3153571, prev total WAL file size 3153571, number of live WAL files 2.
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.967816) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(2009KB)], [66(13MB)]
Feb  2 05:19:26 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027566967865, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 16554470, "oldest_snapshot_seqno": -1}
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6541 keys, 14507996 bytes, temperature: kUnknown
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027567092125, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 14507996, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14463888, "index_size": 26652, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 171985, "raw_average_key_size": 26, "raw_value_size": 14345423, "raw_average_value_size": 2193, "num_data_blocks": 1052, "num_entries": 6541, "num_filter_entries": 6541, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027566, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.092729) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 14507996 bytes
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.094261) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 132.8 rd, 116.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 13.8 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(15.1) write-amplify(7.1) OK, records in: 7057, records dropped: 516 output_compression: NoCompression
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.094291) EVENT_LOG_v1 {"time_micros": 1770027567094277, "job": 40, "event": "compaction_finished", "compaction_time_micros": 124658, "compaction_time_cpu_micros": 26506, "output_level": 6, "num_output_files": 1, "total_output_size": 14507996, "num_input_records": 7057, "num_output_records": 6541, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027567095221, "job": 40, "event": "table_file_deletion", "file_number": 68}
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027567097337, "job": 40, "event": "table_file_deletion", "file_number": 66}
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.967759) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.097485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.097490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.097491) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.097493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.097494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3504227893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:19:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:27.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:27 np0005604791 nova_compute[226294]: 2026-02-02 10:19:27.127 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:19:27 np0005604791 nova_compute[226294]: 2026-02-02 10:19:27.133 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:19:27 np0005604791 nova_compute[226294]: 2026-02-02 10:19:27.162 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:19:27 np0005604791 nova_compute[226294]: 2026-02-02 10:19:27.163 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:19:27 np0005604791 nova_compute[226294]: 2026-02-02 10:19:27.163 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:19:27 np0005604791 podman[246196]: 2026-02-02 10:19:27.382475472 +0000 UTC m=+0.055208952 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:19:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:27 np0005604791 nova_compute[226294]: 2026-02-02 10:19:27.906 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:19:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:28.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:28 np0005604791 nova_compute[226294]: 2026-02-02 10:19:28.164 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:19:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:29.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:30.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:31.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:31 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:19:31.890 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb  2 05:19:31 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:19:31.891 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb  2 05:19:31 np0005604791 nova_compute[226294]: 2026-02-02 10:19:31.891 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:32.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:32 np0005604791 nova_compute[226294]: 2026-02-02 10:19:32.955 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:33.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:34.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:35.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 05:19:35 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6906 writes, 36K keys, 6906 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s#012Cumulative WAL: 6906 writes, 6906 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1550 writes, 8350 keys, 1550 commit groups, 1.0 writes per commit group, ingest: 17.96 MB, 0.03 MB/s#012Interval WAL: 1550 writes, 1550 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0    105.3      0.52              0.13        20    0.026       0      0       0.0       0.0#012  L6      1/0   13.84 MB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   4.4    119.0    101.8      2.39              0.54        19    0.126    108K    10K       0.0       0.0#012 Sum      1/0   13.84 MB   0.0      0.3     0.1      0.2       0.3      0.1       0.0   5.4     97.6    102.4      2.91              0.67        39    0.075    108K    10K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.0     99.7    100.4      0.78              0.17        10    0.078     34K   3576       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   0.0    119.0    101.8      2.39              0.54        19    0.126    108K    10K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0    105.7      0.52              0.13        19    0.027       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.054, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.29 GB write, 0.12 MB/s write, 0.28 GB read, 0.12 MB/s read, 2.9 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a64debd350#2 capacity: 304.00 MB usage: 26.29 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000284 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1582,25.45 MB,8.37284%) FilterBlock(39,320.67 KB,0.103012%) IndexBlock(39,533.52 KB,0.171385%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Feb  2 05:19:35 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:19:35.893 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb  2 05:19:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:19:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:36.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:19:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:37.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:37 np0005604791 nova_compute[226294]: 2026-02-02 10:19:37.957 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:19:37 np0005604791 nova_compute[226294]: 2026-02-02 10:19:37.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:19:37 np0005604791 nova_compute[226294]: 2026-02-02 10:19:37.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:19:37 np0005604791 nova_compute[226294]: 2026-02-02 10:19:37.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:19:37 np0005604791 nova_compute[226294]: 2026-02-02 10:19:37.995 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:37 np0005604791 nova_compute[226294]: 2026-02-02 10:19:37.995 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:19:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:38.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:39.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:40.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:41.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:42.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:42 np0005604791 nova_compute[226294]: 2026-02-02 10:19:42.996 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:19:42 np0005604791 nova_compute[226294]: 2026-02-02 10:19:42.998 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:19:42 np0005604791 nova_compute[226294]: 2026-02-02 10:19:42.999 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:19:42 np0005604791 nova_compute[226294]: 2026-02-02 10:19:42.999 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:19:43 np0005604791 nova_compute[226294]: 2026-02-02 10:19:43.035 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:43 np0005604791 nova_compute[226294]: 2026-02-02 10:19:43.035 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:19:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:43.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:44.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:19:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:19:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:19:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:19:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:19:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:19:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:19:44 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:19:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:19:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:19:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:19:44.920 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:19:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:19:44.920 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:19:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:45.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:19:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:46.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:19:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:47.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:48 np0005604791 nova_compute[226294]: 2026-02-02 10:19:48.036 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:19:48 np0005604791 nova_compute[226294]: 2026-02-02 10:19:48.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:19:48 np0005604791 nova_compute[226294]: 2026-02-02 10:19:48.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:19:48 np0005604791 nova_compute[226294]: 2026-02-02 10:19:48.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:19:48 np0005604791 nova_compute[226294]: 2026-02-02 10:19:48.070 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:48 np0005604791 nova_compute[226294]: 2026-02-02 10:19:48.070 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:19:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:48.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:19:48 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:19:48 np0005604791 podman[246429]: 2026-02-02 10:19:48.944069836 +0000 UTC m=+0.100845701 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 05:19:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:49.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:50.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:51.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:19:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:52.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:19:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:53 np0005604791 nova_compute[226294]: 2026-02-02 10:19:53.071 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:19:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:53.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:19:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:54.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:19:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:19:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:55.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:19:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:56.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:57.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:19:58 np0005604791 nova_compute[226294]: 2026-02-02 10:19:58.073 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb  2 05:19:58 np0005604791 nova_compute[226294]: 2026-02-02 10:19:58.075 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb  2 05:19:58 np0005604791 nova_compute[226294]: 2026-02-02 10:19:58.075 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb  2 05:19:58 np0005604791 nova_compute[226294]: 2026-02-02 10:19:58.075 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:19:58 np0005604791 nova_compute[226294]: 2026-02-02 10:19:58.121 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:19:58 np0005604791 nova_compute[226294]: 2026-02-02 10:19:58.122 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:19:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:19:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:58.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:19:58 np0005604791 podman[246463]: 2026-02-02 10:19:58.392246533 +0000 UTC m=+0.063973594 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb  2 05:19:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:19:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:19:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:59.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:00.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:20:00 np0005604791 ceph-mon[80115]: Health detail: HEALTH_WARN 2 failed cephadm daemon(s)
Feb  2 05:20:00 np0005604791 ceph-mon[80115]: [WRN] CEPHADM_FAILED_DAEMON: 2 failed cephadm daemon(s)
Feb  2 05:20:00 np0005604791 ceph-mon[80115]:    daemon nfs.cephfs.0.0.compute-1.mhzhsx on compute-1 is in error state
Feb  2 05:20:00 np0005604791 ceph-mon[80115]:    daemon nfs.cephfs.1.0.compute-2.dciyfa on compute-2 is in error state
Feb  2 05:20:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:01.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:20:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:02.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:03 np0005604791 nova_compute[226294]: 2026-02-02 10:20:03.122 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb  2 05:20:03 np0005604791 nova_compute[226294]: 2026-02-02 10:20:03.124 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:20:03 np0005604791 nova_compute[226294]: 2026-02-02 10:20:03.124 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb  2 05:20:03 np0005604791 nova_compute[226294]: 2026-02-02 10:20:03.124 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:20:03 np0005604791 nova_compute[226294]: 2026-02-02 10:20:03.125 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:20:03 np0005604791 nova_compute[226294]: 2026-02-02 10:20:03.126 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:20:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:03.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:04.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:20:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:05.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:06.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:07.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:08 np0005604791 nova_compute[226294]: 2026-02-02 10:20:08.127 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb  2 05:20:08 np0005604791 nova_compute[226294]: 2026-02-02 10:20:08.129 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb  2 05:20:08 np0005604791 nova_compute[226294]: 2026-02-02 10:20:08.129 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb  2 05:20:08 np0005604791 nova_compute[226294]: 2026-02-02 10:20:08.129 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:20:08 np0005604791 nova_compute[226294]: 2026-02-02 10:20:08.158 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:20:08 np0005604791 nova_compute[226294]: 2026-02-02 10:20:08.159 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:20:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:08.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:09.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:10.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:11.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:12.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:13 np0005604791 nova_compute[226294]: 2026-02-02 10:20:13.160 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb  2 05:20:13 np0005604791 nova_compute[226294]: 2026-02-02 10:20:13.161 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb  2 05:20:13 np0005604791 nova_compute[226294]: 2026-02-02 10:20:13.162 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb  2 05:20:13 np0005604791 nova_compute[226294]: 2026-02-02 10:20:13.162 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:20:13 np0005604791 nova_compute[226294]: 2026-02-02 10:20:13.196 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:20:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:13 np0005604791 nova_compute[226294]: 2026-02-02 10:20:13.197 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:20:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:13.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:20:13 np0005604791 nova_compute[226294]: 2026-02-02 10:20:13.197 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:20:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:14.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:15.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:16.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:17.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:18.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:18 np0005604791 nova_compute[226294]: 2026-02-02 10:20:18.198 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb  2 05:20:18 np0005604791 nova_compute[226294]: 2026-02-02 10:20:18.200 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb  2 05:20:18 np0005604791 nova_compute[226294]: 2026-02-02 10:20:18.200 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb  2 05:20:18 np0005604791 nova_compute[226294]: 2026-02-02 10:20:18.200 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:20:18 np0005604791 nova_compute[226294]: 2026-02-02 10:20:18.253 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:20:18 np0005604791 nova_compute[226294]: 2026-02-02 10:20:18.254 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:20:18 np0005604791 nova_compute[226294]: 2026-02-02 10:20:18.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:20:18 np0005604791 nova_compute[226294]: 2026-02-02 10:20:18.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:20:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:19.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:19 np0005604791 podman[246519]: 2026-02-02 10:20:19.4598973 +0000 UTC m=+0.141137497 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Feb  2 05:20:19 np0005604791 nova_compute[226294]: 2026-02-02 10:20:19.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:20:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:20.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:21.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:21 np0005604791 nova_compute[226294]: 2026-02-02 10:20:21.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:20:21 np0005604791 nova_compute[226294]: 2026-02-02 10:20:21.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb  2 05:20:21 np0005604791 nova_compute[226294]: 2026-02-02 10:20:21.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb  2 05:20:21 np0005604791 nova_compute[226294]: 2026-02-02 10:20:21.674 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb  2 05:20:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:22.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:23.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:23 np0005604791 nova_compute[226294]: 2026-02-02 10:20:23.255 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb  2 05:20:23 np0005604791 nova_compute[226294]: 2026-02-02 10:20:23.256 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:20:23 np0005604791 nova_compute[226294]: 2026-02-02 10:20:23.257 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb  2 05:20:23 np0005604791 nova_compute[226294]: 2026-02-02 10:20:23.257 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:20:23 np0005604791 nova_compute[226294]: 2026-02-02 10:20:23.257 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb  2 05:20:23 np0005604791 nova_compute[226294]: 2026-02-02 10:20:23.259 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb  2 05:20:23 np0005604791 nova_compute[226294]: 2026-02-02 10:20:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:20:23 np0005604791 nova_compute[226294]: 2026-02-02 10:20:23.670 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:20:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:24.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:24 np0005604791 nova_compute[226294]: 2026-02-02 10:20:24.665 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:20:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:25.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:25 np0005604791 nova_compute[226294]: 2026-02-02 10:20:25.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:20:25 np0005604791 nova_compute[226294]: 2026-02-02 10:20:25.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb  2 05:20:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:26.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:26 np0005604791 nova_compute[226294]: 2026-02-02 10:20:26.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:20:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:27.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:20:27 np0005604791 nova_compute[226294]: 2026-02-02 10:20:27.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb  2 05:20:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:27 np0005604791 nova_compute[226294]: 2026-02-02 10:20:27.696 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb  2 05:20:27 np0005604791 nova_compute[226294]: 2026-02-02 10:20:27.697 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb  2 05:20:27 np0005604791 nova_compute[226294]: 2026-02-02 10:20:27.697 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb  2 05:20:27 np0005604791 nova_compute[226294]: 2026-02-02 10:20:27.697 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb  2 05:20:27 np0005604791 nova_compute[226294]: 2026-02-02 10:20:27.698 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb  2 05:20:28 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:20:28 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1092170194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.166 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:20:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:28.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.257 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.366 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.368 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4884MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.368 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.368 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.451 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.452 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.477 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:20:28 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:20:28 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2889706750' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.916 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.922 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.941 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.943 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:20:28 np0005604791 nova_compute[226294]: 2026-02-02 10:20:28.943 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:20:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:29.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:29 np0005604791 podman[246619]: 2026-02-02 10:20:29.403965804 +0000 UTC m=+0.081510178 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb  2 05:20:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:30.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:31.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:32.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:33.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:33 np0005604791 nova_compute[226294]: 2026-02-02 10:20:33.261 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:20:33 np0005604791 nova_compute[226294]: 2026-02-02 10:20:33.262 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:20:33 np0005604791 nova_compute[226294]: 2026-02-02 10:20:33.262 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:20:33 np0005604791 nova_compute[226294]: 2026-02-02 10:20:33.263 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:20:33 np0005604791 nova_compute[226294]: 2026-02-02 10:20:33.299 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:20:33 np0005604791 nova_compute[226294]: 2026-02-02 10:20:33.300 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:20:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:34.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:20:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:35.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:36.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:20:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:37.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:20:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:38.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:38 np0005604791 nova_compute[226294]: 2026-02-02 10:20:38.300 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:20:38 np0005604791 nova_compute[226294]: 2026-02-02 10:20:38.302 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:20:38 np0005604791 nova_compute[226294]: 2026-02-02 10:20:38.302 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:20:38 np0005604791 nova_compute[226294]: 2026-02-02 10:20:38.302 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:20:38 np0005604791 nova_compute[226294]: 2026-02-02 10:20:38.347 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:20:38 np0005604791 nova_compute[226294]: 2026-02-02 10:20:38.348 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:20:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:39.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:40.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:41.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:42.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.716891) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642717029, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 1019, "num_deletes": 251, "total_data_size": 2329272, "memory_usage": 2352552, "flush_reason": "Manual Compaction"}
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642728203, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 1012034, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36824, "largest_seqno": 37838, "table_properties": {"data_size": 1008115, "index_size": 1571, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10365, "raw_average_key_size": 21, "raw_value_size": 999773, "raw_average_value_size": 2036, "num_data_blocks": 66, "num_entries": 491, "num_filter_entries": 491, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027567, "oldest_key_time": 1770027567, "file_creation_time": 1770027642, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 11381 microseconds, and 5805 cpu microseconds.
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.728293) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 1012034 bytes OK
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.728325) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.731332) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.731372) EVENT_LOG_v1 {"time_micros": 1770027642731361, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.731405) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 2324180, prev total WAL file size 2324180, number of live WAL files 2.
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.732874) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303030' seq:72057594037927935, type:22 .. '6D6772737461740031323532' seq:0, type:0; will stop at (end)
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(988KB)], [69(13MB)]
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642732948, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 15520030, "oldest_snapshot_seqno": -1}
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6544 keys, 11914868 bytes, temperature: kUnknown
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642855491, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 11914868, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11874566, "index_size": 22846, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 172256, "raw_average_key_size": 26, "raw_value_size": 11759884, "raw_average_value_size": 1797, "num_data_blocks": 894, "num_entries": 6544, "num_filter_entries": 6544, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027642, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.855737) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 11914868 bytes
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.857132) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.6 rd, 97.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 13.8 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(27.1) write-amplify(11.8) OK, records in: 7032, records dropped: 488 output_compression: NoCompression
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.857175) EVENT_LOG_v1 {"time_micros": 1770027642857163, "job": 42, "event": "compaction_finished", "compaction_time_micros": 122601, "compaction_time_cpu_micros": 25212, "output_level": 6, "num_output_files": 1, "total_output_size": 11914868, "num_input_records": 7032, "num_output_records": 6544, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642857442, "job": 42, "event": "table_file_deletion", "file_number": 71}
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642859199, "job": 42, "event": "table_file_deletion", "file_number": 69}
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.732222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.859345) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.859354) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.859356) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.859359) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:20:42 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.859361) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:20:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:43.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:43 np0005604791 nova_compute[226294]: 2026-02-02 10:20:43.349 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:20:43 np0005604791 nova_compute[226294]: 2026-02-02 10:20:43.351 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:20:43 np0005604791 nova_compute[226294]: 2026-02-02 10:20:43.351 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:20:43 np0005604791 nova_compute[226294]: 2026-02-02 10:20:43.351 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:20:43 np0005604791 nova_compute[226294]: 2026-02-02 10:20:43.394 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:20:43 np0005604791 nova_compute[226294]: 2026-02-02 10:20:43.395 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:20:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:44.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:20:44.921 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:20:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:20:44.921 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:20:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:20:44.921 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:20:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:45.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:46.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:47.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:48.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:48 np0005604791 nova_compute[226294]: 2026-02-02 10:20:48.396 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:20:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:49.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:20:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:20:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:20:49 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:20:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:50.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:50 np0005604791 podman[246755]: 2026-02-02 10:20:50.415286684 +0000 UTC m=+0.092190022 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb  2 05:20:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:51.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:20:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:52.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:53.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:53 np0005604791 nova_compute[226294]: 2026-02-02 10:20:53.397 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:20:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:20:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:54.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:20:54 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:20:54 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:20:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:55.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:56.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:57.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:20:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:20:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:20:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:58.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:20:58 np0005604791 nova_compute[226294]: 2026-02-02 10:20:58.401 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:20:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:20:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:20:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:59.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:00.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:00 np0005604791 podman[246813]: 2026-02-02 10:21:00.369045714 +0000 UTC m=+0.046153973 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb  2 05:21:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:01.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:02.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:03.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:03 np0005604791 nova_compute[226294]: 2026-02-02 10:21:03.403 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:21:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:04.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:05.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:06.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb  2 05:21:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:07.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb  2 05:21:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:08.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:08 np0005604791 nova_compute[226294]: 2026-02-02 10:21:08.406 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:21:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:09.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:10.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:11.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:12.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:13.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:13 np0005604791 nova_compute[226294]: 2026-02-02 10:21:13.408 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:21:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:14.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:15.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:16.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:17.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:18.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:18 np0005604791 nova_compute[226294]: 2026-02-02 10:21:18.411 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:21:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:19.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:19 np0005604791 nova_compute[226294]: 2026-02-02 10:21:19.943 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:21:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:20.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:20 np0005604791 nova_compute[226294]: 2026-02-02 10:21:20.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:21:20 np0005604791 nova_compute[226294]: 2026-02-02 10:21:20.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:21:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:21.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:21 np0005604791 podman[246870]: 2026-02-02 10:21:21.486745827 +0000 UTC m=+0.155917809 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb  2 05:21:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:22.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:23.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:23 np0005604791 nova_compute[226294]: 2026-02-02 10:21:23.412 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:21:23 np0005604791 nova_compute[226294]: 2026-02-02 10:21:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:21:23 np0005604791 nova_compute[226294]: 2026-02-02 10:21:23.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:21:23 np0005604791 nova_compute[226294]: 2026-02-02 10:21:23.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:21:23 np0005604791 nova_compute[226294]: 2026-02-02 10:21:23.665 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:21:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:24.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:24 np0005604791 nova_compute[226294]: 2026-02-02 10:21:24.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:21:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:25.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:26.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:26 np0005604791 nova_compute[226294]: 2026-02-02 10:21:26.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:21:26 np0005604791 nova_compute[226294]: 2026-02-02 10:21:26.647 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:21:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:27.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:27 np0005604791 nova_compute[226294]: 2026-02-02 10:21:27.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:21:27 np0005604791 nova_compute[226294]: 2026-02-02 10:21:27.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:21:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:28.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:28 np0005604791 nova_compute[226294]: 2026-02-02 10:21:28.415 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:21:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:29.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:29 np0005604791 nova_compute[226294]: 2026-02-02 10:21:29.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:21:29 np0005604791 nova_compute[226294]: 2026-02-02 10:21:29.688 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:21:29 np0005604791 nova_compute[226294]: 2026-02-02 10:21:29.688 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:21:29 np0005604791 nova_compute[226294]: 2026-02-02 10:21:29.688 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:21:29 np0005604791 nova_compute[226294]: 2026-02-02 10:21:29.688 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:21:29 np0005604791 nova_compute[226294]: 2026-02-02 10:21:29.688 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:21:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:21:30 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3243985345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.174 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:21:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:30.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.308 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.310 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4884MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.310 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.311 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.375 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.375 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.391 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:21:30 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:21:30 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/207582716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.846 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.852 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.869 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.871 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:21:30 np0005604791 nova_compute[226294]: 2026-02-02 10:21:30.872 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:21:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:31.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:31 np0005604791 podman[246970]: 2026-02-02 10:21:31.395231729 +0000 UTC m=+0.066272375 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Feb  2 05:21:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:32.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:33.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:33 np0005604791 nova_compute[226294]: 2026-02-02 10:21:33.417 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:21:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:34.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:35.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:36.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:37.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:38.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:38 np0005604791 nova_compute[226294]: 2026-02-02 10:21:38.420 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:21:38 np0005604791 nova_compute[226294]: 2026-02-02 10:21:38.422 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:21:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:39.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:40.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:41.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:42.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:43.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:43 np0005604791 nova_compute[226294]: 2026-02-02 10:21:43.423 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:21:43 np0005604791 nova_compute[226294]: 2026-02-02 10:21:43.425 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:21:43 np0005604791 nova_compute[226294]: 2026-02-02 10:21:43.425 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:21:43 np0005604791 nova_compute[226294]: 2026-02-02 10:21:43.425 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:21:43 np0005604791 nova_compute[226294]: 2026-02-02 10:21:43.462 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:21:43 np0005604791 nova_compute[226294]: 2026-02-02 10:21:43.463 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:21:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:44.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:21:44.922 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:21:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:21:44.923 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:21:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:21:44.923 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:21:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:45.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:46.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:47.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:48.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:48 np0005604791 nova_compute[226294]: 2026-02-02 10:21:48.464 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:21:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:49.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:21:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:50.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:51.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:52.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:52 np0005604791 podman[247026]: 2026-02-02 10:21:52.408659693 +0000 UTC m=+0.085027412 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb  2 05:21:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:53.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:53 np0005604791 nova_compute[226294]: 2026-02-02 10:21:53.466 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:21:53 np0005604791 nova_compute[226294]: 2026-02-02 10:21:53.468 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:21:53 np0005604791 nova_compute[226294]: 2026-02-02 10:21:53.468 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:21:53 np0005604791 nova_compute[226294]: 2026-02-02 10:21:53.469 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:21:53 np0005604791 nova_compute[226294]: 2026-02-02 10:21:53.501 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:21:53 np0005604791 nova_compute[226294]: 2026-02-02 10:21:53.502 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:21:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:54.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb  2 05:21:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2404311604' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb  2 05:21:55 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb  2 05:21:55 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2404311604' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb  2 05:21:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:21:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:55.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:21:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:21:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:21:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:21:56 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:21:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:56.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:57.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:21:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:21:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:58.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:21:58 np0005604791 nova_compute[226294]: 2026-02-02 10:21:58.503 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:21:58 np0005604791 nova_compute[226294]: 2026-02-02 10:21:58.504 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:21:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:21:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:21:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:59.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:22:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:00.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:01.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:01 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:22:01 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:22:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:02.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:02 np0005604791 podman[247163]: 2026-02-02 10:22:02.387223929 +0000 UTC m=+0.054713670 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb  2 05:22:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:03.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:03 np0005604791 nova_compute[226294]: 2026-02-02 10:22:03.505 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:03 np0005604791 nova_compute[226294]: 2026-02-02 10:22:03.507 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:03 np0005604791 nova_compute[226294]: 2026-02-02 10:22:03.507 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:22:03 np0005604791 nova_compute[226294]: 2026-02-02 10:22:03.507 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:22:03 np0005604791 nova_compute[226294]: 2026-02-02 10:22:03.542 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:03 np0005604791 nova_compute[226294]: 2026-02-02 10:22:03.543 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:22:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:04.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:05.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:22:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:06.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:22:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:07.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:08.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:08 np0005604791 nova_compute[226294]: 2026-02-02 10:22:08.543 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:09.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:10.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:11.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:12.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:13.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:13 np0005604791 nova_compute[226294]: 2026-02-02 10:22:13.546 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:14.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:22:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:15.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:22:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:16.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:17.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:18.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:18 np0005604791 nova_compute[226294]: 2026-02-02 10:22:18.548 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:18 np0005604791 nova_compute[226294]: 2026-02-02 10:22:18.550 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:18 np0005604791 nova_compute[226294]: 2026-02-02 10:22:18.550 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:22:18 np0005604791 nova_compute[226294]: 2026-02-02 10:22:18.550 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:22:18 np0005604791 nova_compute[226294]: 2026-02-02 10:22:18.552 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:18 np0005604791 nova_compute[226294]: 2026-02-02 10:22:18.553 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:22:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:22:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:19.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:22:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:20.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:21.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:21 np0005604791 nova_compute[226294]: 2026-02-02 10:22:21.872 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:22:21 np0005604791 nova_compute[226294]: 2026-02-02 10:22:21.873 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:22:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:22.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:22 np0005604791 nova_compute[226294]: 2026-02-02 10:22:22.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:22:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:23 np0005604791 podman[247245]: 2026-02-02 10:22:23.283879602 +0000 UTC m=+0.091103142 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb  2 05:22:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:23.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:23 np0005604791 nova_compute[226294]: 2026-02-02 10:22:23.553 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:23 np0005604791 nova_compute[226294]: 2026-02-02 10:22:23.554 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:23 np0005604791 nova_compute[226294]: 2026-02-02 10:22:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:22:23 np0005604791 nova_compute[226294]: 2026-02-02 10:22:23.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:22:23 np0005604791 nova_compute[226294]: 2026-02-02 10:22:23.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:22:23 np0005604791 nova_compute[226294]: 2026-02-02 10:22:23.675 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:22:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:22:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:24.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:22:24 np0005604791 nova_compute[226294]: 2026-02-02 10:22:24.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:22:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:25.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:25 np0005604791 nova_compute[226294]: 2026-02-02 10:22:25.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:22:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:26.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:27.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:27 np0005604791 nova_compute[226294]: 2026-02-02 10:22:27.658 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:22:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:28.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:28 np0005604791 nova_compute[226294]: 2026-02-02 10:22:28.555 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:28 np0005604791 nova_compute[226294]: 2026-02-02 10:22:28.557 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:28 np0005604791 nova_compute[226294]: 2026-02-02 10:22:28.557 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:22:28 np0005604791 nova_compute[226294]: 2026-02-02 10:22:28.558 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:22:28 np0005604791 nova_compute[226294]: 2026-02-02 10:22:28.594 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:28 np0005604791 nova_compute[226294]: 2026-02-02 10:22:28.595 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:22:28 np0005604791 nova_compute[226294]: 2026-02-02 10:22:28.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:22:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:29.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:29 np0005604791 nova_compute[226294]: 2026-02-02 10:22:29.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:22:29 np0005604791 nova_compute[226294]: 2026-02-02 10:22:29.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:22:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:30.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:22:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:31.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:22:31 np0005604791 nova_compute[226294]: 2026-02-02 10:22:31.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:22:31 np0005604791 nova_compute[226294]: 2026-02-02 10:22:31.673 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:22:31 np0005604791 nova_compute[226294]: 2026-02-02 10:22:31.674 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:22:31 np0005604791 nova_compute[226294]: 2026-02-02 10:22:31.675 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:22:31 np0005604791 nova_compute[226294]: 2026-02-02 10:22:31.675 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:22:31 np0005604791 nova_compute[226294]: 2026-02-02 10:22:31.676 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:22:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:22:32 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/689481102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.171 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.299 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.300 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4874MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.300 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.301 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:22:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:32.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.405 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.405 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.419 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:22:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:22:32 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1174471894' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.890 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.896 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.917 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.920 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:22:32 np0005604791 nova_compute[226294]: 2026-02-02 10:22:32.921 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:22:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:33.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:33 np0005604791 podman[247322]: 2026-02-02 10:22:33.384798758 +0000 UTC m=+0.060374539 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:22:33 np0005604791 nova_compute[226294]: 2026-02-02 10:22:33.596 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:34.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:35.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:36.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:37.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:38.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:38 np0005604791 nova_compute[226294]: 2026-02-02 10:22:38.598 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:39.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:40.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:41.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:42.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:43.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:43 np0005604791 nova_compute[226294]: 2026-02-02 10:22:43.600 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:44.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:22:44.923 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:22:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:22:44.924 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:22:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:22:44.924 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:22:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:45.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:46.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:47.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:48.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:48 np0005604791 nova_compute[226294]: 2026-02-02 10:22:48.604 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:48 np0005604791 nova_compute[226294]: 2026-02-02 10:22:48.606 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:48 np0005604791 nova_compute[226294]: 2026-02-02 10:22:48.607 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:22:48 np0005604791 nova_compute[226294]: 2026-02-02 10:22:48.607 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:22:48 np0005604791 nova_compute[226294]: 2026-02-02 10:22:48.633 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:48 np0005604791 nova_compute[226294]: 2026-02-02 10:22:48.633 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:22:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:49.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:50.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:51.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:52.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:53.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:53 np0005604791 podman[247376]: 2026-02-02 10:22:53.466393744 +0000 UTC m=+0.142899065 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb  2 05:22:53 np0005604791 nova_compute[226294]: 2026-02-02 10:22:53.633 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:53 np0005604791 nova_compute[226294]: 2026-02-02 10:22:53.635 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:22:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:54.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:22:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:55.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:56.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:22:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:57.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:22:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:22:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:58.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:22:58 np0005604791 nova_compute[226294]: 2026-02-02 10:22:58.635 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:22:58 np0005604791 nova_compute[226294]: 2026-02-02 10:22:58.636 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:58 np0005604791 nova_compute[226294]: 2026-02-02 10:22:58.636 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:22:58 np0005604791 nova_compute[226294]: 2026-02-02 10:22:58.637 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:22:58 np0005604791 nova_compute[226294]: 2026-02-02 10:22:58.637 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:22:58 np0005604791 nova_compute[226294]: 2026-02-02 10:22:58.639 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.826787) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778826827, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1568, "num_deletes": 251, "total_data_size": 3859254, "memory_usage": 3913120, "flush_reason": "Manual Compaction"}
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778849514, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 2515296, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37843, "largest_seqno": 39406, "table_properties": {"data_size": 2508757, "index_size": 3674, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13899, "raw_average_key_size": 19, "raw_value_size": 2495610, "raw_average_value_size": 3590, "num_data_blocks": 160, "num_entries": 695, "num_filter_entries": 695, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027643, "oldest_key_time": 1770027643, "file_creation_time": 1770027778, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 22786 microseconds, and 6479 cpu microseconds.
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.849572) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 2515296 bytes OK
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.849596) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.851735) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.851756) EVENT_LOG_v1 {"time_micros": 1770027778851749, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.851779) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 3852011, prev total WAL file size 3852011, number of live WAL files 2.
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.852716) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(2456KB)], [72(11MB)]
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778852766, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 14430164, "oldest_snapshot_seqno": -1}
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6723 keys, 12201192 bytes, temperature: kUnknown
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778953628, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 12201192, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12159961, "index_size": 23327, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16837, "raw_key_size": 176643, "raw_average_key_size": 26, "raw_value_size": 12042319, "raw_average_value_size": 1791, "num_data_blocks": 910, "num_entries": 6723, "num_filter_entries": 6723, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027778, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.953988) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 12201192 bytes
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.956528) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.9 rd, 120.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 11.4 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(10.6) write-amplify(4.9) OK, records in: 7239, records dropped: 516 output_compression: NoCompression
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.956566) EVENT_LOG_v1 {"time_micros": 1770027778956548, "job": 44, "event": "compaction_finished", "compaction_time_micros": 100960, "compaction_time_cpu_micros": 31418, "output_level": 6, "num_output_files": 1, "total_output_size": 12201192, "num_input_records": 7239, "num_output_records": 6723, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778957335, "job": 44, "event": "table_file_deletion", "file_number": 74}
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778959478, "job": 44, "event": "table_file_deletion", "file_number": 72}
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.852574) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.959586) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.959594) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.959598) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.959602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:22:58 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.959607) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:22:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:22:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:22:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:59.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:23:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:00.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:23:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:01.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:01 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:23:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:02.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:23:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:23:02 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:23:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:03.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:03 np0005604791 nova_compute[226294]: 2026-02-02 10:23:03.640 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:04 np0005604791 podman[247516]: 2026-02-02 10:23:04.410090953 +0000 UTC m=+0.086244613 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Feb  2 05:23:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:04.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:05.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:06.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:07.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:08.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:08 np0005604791 nova_compute[226294]: 2026-02-02 10:23:08.640 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:08 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:23:08 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:23:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:09.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:10.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:11.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:12.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:13.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:13 np0005604791 nova_compute[226294]: 2026-02-02 10:23:13.642 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:14.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:15.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:16.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:17.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:18.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:18 np0005604791 nova_compute[226294]: 2026-02-02 10:23:18.643 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:19.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:20.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:21.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:22.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:22 np0005604791 nova_compute[226294]: 2026-02-02 10:23:22.922 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:22 np0005604791 nova_compute[226294]: 2026-02-02 10:23:22.923 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:23.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:23 np0005604791 nova_compute[226294]: 2026-02-02 10:23:23.646 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:24 np0005604791 podman[247596]: 2026-02-02 10:23:24.441801866 +0000 UTC m=+0.109861042 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb  2 05:23:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:24.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:24 np0005604791 nova_compute[226294]: 2026-02-02 10:23:24.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:24 np0005604791 nova_compute[226294]: 2026-02-02 10:23:24.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:25.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:25 np0005604791 nova_compute[226294]: 2026-02-02 10:23:25.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:25 np0005604791 nova_compute[226294]: 2026-02-02 10:23:25.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:23:25 np0005604791 nova_compute[226294]: 2026-02-02 10:23:25.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:23:25 np0005604791 nova_compute[226294]: 2026-02-02 10:23:25.665 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:23:25 np0005604791 nova_compute[226294]: 2026-02-02 10:23:25.666 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:25 np0005604791 nova_compute[226294]: 2026-02-02 10:23:25.666 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Feb  2 05:23:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:26.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:26 np0005604791 nova_compute[226294]: 2026-02-02 10:23:26.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:27.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:28.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:28 np0005604791 nova_compute[226294]: 2026-02-02 10:23:28.648 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:28 np0005604791 nova_compute[226294]: 2026-02-02 10:23:28.658 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:29.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:30.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:30 np0005604791 nova_compute[226294]: 2026-02-02 10:23:30.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:31.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:31 np0005604791 nova_compute[226294]: 2026-02-02 10:23:31.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:31 np0005604791 nova_compute[226294]: 2026-02-02 10:23:31.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:23:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:32.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:32 np0005604791 nova_compute[226294]: 2026-02-02 10:23:32.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:32 np0005604791 nova_compute[226294]: 2026-02-02 10:23:32.671 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:23:32 np0005604791 nova_compute[226294]: 2026-02-02 10:23:32.671 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:23:32 np0005604791 nova_compute[226294]: 2026-02-02 10:23:32.672 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:23:32 np0005604791 nova_compute[226294]: 2026-02-02 10:23:32.672 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:23:32 np0005604791 nova_compute[226294]: 2026-02-02 10:23:32.673 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:23:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:33 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:23:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2284638192' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.175 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.328 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.329 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4876MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.330 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.330 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.396 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.397 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.411 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:23:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:33.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.653 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:33 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:23:33 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/761614579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.846 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.850 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.866 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.868 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:23:33 np0005604791 nova_compute[226294]: 2026-02-02 10:23:33.869 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:23:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:34.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:35.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:35 np0005604791 podman[247673]: 2026-02-02 10:23:35.450325906 +0000 UTC m=+0.066521007 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb  2 05:23:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:23:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:36.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:23:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:37.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:38.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:38 np0005604791 nova_compute[226294]: 2026-02-02 10:23:38.652 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:38 np0005604791 nova_compute[226294]: 2026-02-02 10:23:38.658 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:39.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:39 np0005604791 nova_compute[226294]: 2026-02-02 10:23:39.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:23:39 np0005604791 nova_compute[226294]: 2026-02-02 10:23:39.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Feb  2 05:23:39 np0005604791 nova_compute[226294]: 2026-02-02 10:23:39.664 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Feb  2 05:23:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:40.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:41.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:42.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:23:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:43.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:23:43 np0005604791 nova_compute[226294]: 2026-02-02 10:23:43.655 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:43 np0005604791 nova_compute[226294]: 2026-02-02 10:23:43.659 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:44.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:23:44.925 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:23:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:23:44.926 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:23:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:23:44.926 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:23:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:23:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:45.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:23:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:46.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:47.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:23:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:48.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:23:48 np0005604791 nova_compute[226294]: 2026-02-02 10:23:48.657 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:48 np0005604791 nova_compute[226294]: 2026-02-02 10:23:48.660 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:49.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:50.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:23:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:51.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:23:52 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:52 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:52 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:52.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:52 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:53 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:53 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:53 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:53.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:53 np0005604791 nova_compute[226294]: 2026-02-02 10:23:53.660 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:54 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:54 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:23:54 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:54.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:23:55 np0005604791 podman[247727]: 2026-02-02 10:23:55.431507539 +0000 UTC m=+0.107515531 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb  2 05:23:55 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:55 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:55 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:55.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:56 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:56 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:56 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:56.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:57 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:57 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:57 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:57.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:57 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:23:58 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:58 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:58 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:58.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:23:58 np0005604791 nova_compute[226294]: 2026-02-02 10:23:58.661 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:58 np0005604791 nova_compute[226294]: 2026-02-02 10:23:58.663 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:23:59 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:23:59 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:23:59 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:59.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:00 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:00 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:00 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:00.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:01 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:01 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:01 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:01.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:02 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:02 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:02 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:02.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:02 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:24:03 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:03 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:03 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:03.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:03 np0005604791 nova_compute[226294]: 2026-02-02 10:24:03.663 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:24:04 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:04 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:24:04 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:04.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:24:05 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:05 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:24:05 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:05.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:24:06 np0005604791 podman[247785]: 2026-02-02 10:24:06.398370375 +0000 UTC m=+0.066807064 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb  2 05:24:06 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:06 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:24:06 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:06.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:24:07 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:07 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:07 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:07.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:07 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:24:08 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:08 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:08 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:08.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:08 np0005604791 nova_compute[226294]: 2026-02-02 10:24:08.665 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:24:08 np0005604791 nova_compute[226294]: 2026-02-02 10:24:08.668 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:24:09 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb  2 05:24:09 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb  2 05:24:09 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:24:09 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:24:09 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb  2 05:24:09 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:09 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:09 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:09.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:10 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:10 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:10 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:10.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:11 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:11 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:11 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:11.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:12 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:12 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:24:12 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:12.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:24:12 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:24:13 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:13 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:24:13 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:13.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:24:13 np0005604791 nova_compute[226294]: 2026-02-02 10:24:13.667 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:24:13 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:24:13 np0005604791 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb  2 05:24:14 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:14 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:14 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:14.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:15 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:15 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:24:15 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:15.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:24:16 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:16 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:16 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:16.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:17 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:17 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:24:17 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:17.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:24:17 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:24:18 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:18 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:18 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:18.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:18 np0005604791 nova_compute[226294]: 2026-02-02 10:24:18.669 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:24:18 np0005604791 nova_compute[226294]: 2026-02-02 10:24:18.671 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:24:18 np0005604791 nova_compute[226294]: 2026-02-02 10:24:18.671 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:24:18 np0005604791 nova_compute[226294]: 2026-02-02 10:24:18.671 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:24:18 np0005604791 nova_compute[226294]: 2026-02-02 10:24:18.707 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:24:18 np0005604791 nova_compute[226294]: 2026-02-02 10:24:18.707 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:24:19 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:19 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:24:19 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:19.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:24:20 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:20 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:20 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:20.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:21 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:21 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:21 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:21.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:22 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:22 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:22 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:22.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:22 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:24:23 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:23 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:23 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:23.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:23 np0005604791 nova_compute[226294]: 2026-02-02 10:24:23.664 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:24:23 np0005604791 nova_compute[226294]: 2026-02-02 10:24:23.709 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:24:24 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:24 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:24 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:24.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:24 np0005604791 nova_compute[226294]: 2026-02-02 10:24:24.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:24:24 np0005604791 nova_compute[226294]: 2026-02-02 10:24:24.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:24:25 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:25 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:24:25 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:25.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:24:25 np0005604791 nova_compute[226294]: 2026-02-02 10:24:25.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:24:26 np0005604791 podman[247946]: 2026-02-02 10:24:26.419065672 +0000 UTC m=+0.090645961 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Feb  2 05:24:26 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:26 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:26 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:26.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:26 np0005604791 nova_compute[226294]: 2026-02-02 10:24:26.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:24:26 np0005604791 nova_compute[226294]: 2026-02-02 10:24:26.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb  2 05:24:26 np0005604791 nova_compute[226294]: 2026-02-02 10:24:26.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb  2 05:24:26 np0005604791 nova_compute[226294]: 2026-02-02 10:24:26.686 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Feb  2 05:24:27 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:27 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:27 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:27.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:27 np0005604791 nova_compute[226294]: 2026-02-02 10:24:27.682 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:24:27 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:24:27 np0005604791 systemd-logind[805]: New session 58 of user zuul.
Feb  2 05:24:27 np0005604791 systemd[1]: Started Session 58 of User zuul.
Feb  2 05:24:28 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:28 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:28 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:28.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:28 np0005604791 nova_compute[226294]: 2026-02-02 10:24:28.671 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:24:28 np0005604791 nova_compute[226294]: 2026-02-02 10:24:28.710 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb  2 05:24:28 np0005604791 nova_compute[226294]: 2026-02-02 10:24:28.712 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:24:28 np0005604791 nova_compute[226294]: 2026-02-02 10:24:28.712 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb  2 05:24:28 np0005604791 nova_compute[226294]: 2026-02-02 10:24:28.712 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:24:28 np0005604791 nova_compute[226294]: 2026-02-02 10:24:28.712 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb  2 05:24:28 np0005604791 nova_compute[226294]: 2026-02-02 10:24:28.713 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:24:29 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:29 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:29 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:29.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:30 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:30 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:30 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:30.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:31 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Feb  2 05:24:31 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1495132194' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb  2 05:24:31 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:31 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:31 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:31.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:31 np0005604791 nova_compute[226294]: 2026-02-02 10:24:31.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:24:31 np0005604791 nova_compute[226294]: 2026-02-02 10:24:31.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb  2 05:24:32 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:32 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:24:32 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:32.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:24:32 np0005604791 nova_compute[226294]: 2026-02-02 10:24:32.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:24:32 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:24:33 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:33 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:24:33 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:33.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:24:33 np0005604791 nova_compute[226294]: 2026-02-02 10:24:33.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb  2 05:24:33 np0005604791 nova_compute[226294]: 2026-02-02 10:24:33.676 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:24:33 np0005604791 nova_compute[226294]: 2026-02-02 10:24:33.678 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:24:33 np0005604791 nova_compute[226294]: 2026-02-02 10:24:33.678 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:24:33 np0005604791 nova_compute[226294]: 2026-02-02 10:24:33.678 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb  2 05:24:33 np0005604791 nova_compute[226294]: 2026-02-02 10:24:33.679 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:24:33 np0005604791 nova_compute[226294]: 2026-02-02 10:24:33.713 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:24:33 np0005604791 ovs-vsctl[248329]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb  2 05:24:34 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:24:34 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2759038895' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.100 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.236 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.237 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4712MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.237 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.238 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.412 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.412 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.522 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb  2 05:24:34 np0005604791 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb  2 05:24:34 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:34 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:34 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:34.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:34 np0005604791 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb  2 05:24:34 np0005604791 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.616 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.617 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.644 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.669 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Feb  2 05:24:34 np0005604791 nova_compute[226294]: 2026-02-02 10:24:34.719 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb  2 05:24:35 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: cache status {prefix=cache status} (starting...)
Feb  2 05:24:35 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:35 np0005604791 lvm[248680]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb  2 05:24:35 np0005604791 lvm[248680]: VG ceph_vg0 finished
Feb  2 05:24:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb  2 05:24:35 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1419073997' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb  2 05:24:35 np0005604791 nova_compute[226294]: 2026-02-02 10:24:35.217 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb  2 05:24:35 np0005604791 nova_compute[226294]: 2026-02-02 10:24:35.225 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb  2 05:24:35 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: client ls {prefix=client ls} (starting...)
Feb  2 05:24:35 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:35 np0005604791 nova_compute[226294]: 2026-02-02 10:24:35.244 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb  2 05:24:35 np0005604791 nova_compute[226294]: 2026-02-02 10:24:35.246 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb  2 05:24:35 np0005604791 nova_compute[226294]: 2026-02-02 10:24:35.246 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:24:35 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:35 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:35 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:35.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:35 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: damage ls {prefix=damage ls} (starting...)
Feb  2 05:24:35 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:35 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Feb  2 05:24:35 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/936668337' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb  2 05:24:35 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump loads {prefix=dump loads} (starting...)
Feb  2 05:24:35 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb  2 05:24:36 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/663724622' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:36 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:36 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:36 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:36.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:36 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Feb  2 05:24:36 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1010756422' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: get subtrees {prefix=get subtrees} (starting...)
Feb  2 05:24:36 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:37 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: ops {prefix=ops} (starting...)
Feb  2 05:24:37 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Feb  2 05:24:37 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2333568486' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Feb  2 05:24:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Feb  2 05:24:37 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3989197214' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Feb  2 05:24:37 np0005604791 podman[248991]: 2026-02-02 10:24:37.41778746 +0000 UTC m=+0.087508497 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb  2 05:24:37 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:37 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:37 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:37.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:24:37 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: session ls {prefix=session ls} (starting...)
Feb  2 05:24:37 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb  2 05:24:37 np0005604791 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: status {prefix=status} (starting...)
Feb  2 05:24:37 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb  2 05:24:37 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3120345489' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb  2 05:24:38 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb  2 05:24:38 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/703134171' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb  2 05:24:38 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Feb  2 05:24:38 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1165547802' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb  2 05:24:38 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:38 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:38 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:38.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:38 np0005604791 nova_compute[226294]: 2026-02-02 10:24:38.715 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:24:38 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb  2 05:24:38 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/73952731' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb  2 05:24:38 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Feb  2 05:24:38 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/512297101' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Feb  2 05:24:39 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb  2 05:24:39 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2629942829' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb  2 05:24:39 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:39 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:39 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:39.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:39 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Feb  2 05:24:39 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/888108608' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Feb  2 05:24:39 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Feb  2 05:24:39 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3061187504' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Feb  2 05:24:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Feb  2 05:24:40 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3424334972' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Feb  2 05:24:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb  2 05:24:40 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1690255577' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb  2 05:24:40 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:40 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:40 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:40.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:40 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb  2 05:24:40 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1744096391' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321e000 session 0x5616e34dbc20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321f800 session 0x5616e0d17860
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.825603485s of 16.828807831s, submitted: 1
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962342 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963854 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.089159966s of 12.096708298s, submitted: 2
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963263 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3283c00 session 0x5616e3750f00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2cafc00 session 0x5616e3750960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3542000 session 0x5616e311a000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226ac00 session 0x5616e2797860
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 57.016975403s of 57.231136322s, submitted: 2
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85188608 unmapped: 2932736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85188608 unmapped: 2932736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964907 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966419 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.142086029s of 13.158586502s, submitted: 4
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb  2 05:24:41 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1580771436' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e11f9000 session 0x5616e278a000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3281c00 session 0x5616e312c000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226c400 session 0x5616e311fe00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e11f9800 session 0x5616e311a3c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.639591217s of 18.651557922s, submitted: 3
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.387768745s of 10.394624710s, submitted: 2
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964646 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964514 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21e4000 session 0x5616e311af00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3281000 session 0x5616e312ef00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.559337616s of 33.630935669s, submitted: 4
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.959052086s of 15.965865135s, submitted: 2
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e0541c00 session 0x5616e0d6e1e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3222000 session 0x5616e311b0e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e1193000 session 0x5616e312c960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 53.633445740s of 53.641864777s, submitted: 1
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967538 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967538 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.074842453s of 12.082664490s, submitted: 2
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966947 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321a800 session 0x5616e311a780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3220400 session 0x5616e34ea5a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 2834432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 2834432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.421203613s of 22.434373856s, submitted: 2
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968459 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967868 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967868 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.030465126s of 15.042689323s, submitted: 3
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e327a000 session 0x5616e3750960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967736 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967736 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.958620071s of 10.963719368s, submitted: 1
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970892 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970892 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970301 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.377700806s of 13.986701965s, submitted: 4
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3282400 session 0x5616e311a3c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 48.270744324s of 48.274307251s, submitted: 1
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970301 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971813 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.719709396s of 17.732963562s, submitted: 4
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21adc00 session 0x5616e311b860
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2caf800 session 0x5616e337f2c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.156009674s of 34.160236359s, submitted: 1
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 8815 writes, 34K keys, 8815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 8815 writes, 1876 syncs, 4.70 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 746 writes, 1209 keys, 746 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
Interval WAL: 746 writes, 348 syncs, 2.14 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e1193c00 session 0x5616e311bc20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3285400 session 0x5616e311a780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.711874962s of 15.720390320s, submitted: 2
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.441757202s of 16.448879242s, submitted: 2
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971552 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e34d4800 session 0x5616e3688780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e10efc00 session 0x5616e312ef00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 49.622116089s of 49.629035950s, submitted: 2
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 3768320 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 3612672 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971552 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973064 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread fragmentation_score=0.000029 took=0.000037s
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973064 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.980213165s of 18.280017853s, submitted: 343
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21ac400 session 0x5616e3688000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 41.939346313s of 41.943122864s, submitted: 1
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 974576 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 974576 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973985 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.912214279s of 17.978521347s, submitted: 3
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 977619 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 94093312 unmapped: 3465216 heap: 97558528 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 141 ms_handle_reset con 0x5616e327c800 session 0x5616e1f7b0e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85827584 unmapped: 20127744 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb65f000/0x0/0x4ffc00000, data 0x10f66e2/0x11ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 141 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85868544 unmapped: 20086784 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 142 ms_handle_reset con 0x5616e21e4c00 session 0x5616e37ce780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1141871 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fae59000/0x0/0x4ffc00000, data 0x18fa8f2/0x19b1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e108a800 session 0x5616e3688d20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e3280c00 session 0x5616e36965a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 39.927474976s of 40.098861694s, submitted: 30
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147855 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae58000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1148527 data_alloc: 218103808 data_used: 143360
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e226a400 session 0x5616e1eb4780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 145 heartbeat osd_stat(store_statfs(0x4fae58000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 146 ms_handle_reset con 0x5616e2cac000 session 0x5616e34770e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102490112 unmapped: 3465216 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 3440640 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212945 data_alloc: 234881024 data_used: 13774848
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.954066277s of 12.084420204s, submitted: 24
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fad62000/0x0/0x4ffc00000, data 0x19eeaf0/0x1aa8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102416384 unmapped: 3538944 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102416384 unmapped: 3538944 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213508 data_alloc: 234881024 data_used: 13774848
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 2940928 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218848 data_alloc: 234881024 data_used: 14458880
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327c400 session 0x5616e34eb2c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3222400 session 0x5616e3378780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218848 data_alloc: 234881024 data_used: 14458880
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.370294571s of 15.401467323s, submitted: 11
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 2965504 heap: 109101056 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [0,0,1])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106307584 unmapped: 2793472 heap: 109101056 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1281178 data_alloc: 234881024 data_used: 14733312
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f944e000/0x0/0x4ffc00000, data 0x215bac2/0x2216000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1276638 data_alloc: 234881024 data_used: 14733312
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.930480957s of 10.239095688s, submitted: 95
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9435000/0x0/0x4ffc00000, data 0x217cac2/0x2237000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1277223 data_alloc: 234881024 data_used: 14733312
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9435000/0x0/0x4ffc00000, data 0x217cac2/0x2237000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1277072 data_alloc: 234881024 data_used: 14733312
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f942c000/0x0/0x4ffc00000, data 0x2185ac2/0x2240000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e000 session 0x5616e1f04960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d800 session 0x5616e1f054a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0ff1c00 session 0x5616e1f05e00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106577920 unmapped: 3571712 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106577920 unmapped: 3571712 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283800 session 0x5616e1f04780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 1802240 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.226655960s of 12.266713142s, submitted: 7
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321f000 session 0x5616e337e000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108953600 unmapped: 3293184 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0ff1c00 session 0x5616e337f860
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293665 data_alloc: 234881024 data_used: 15781888
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e000 session 0x5616e0d161e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293681 data_alloc: 234881024 data_used: 15781888
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 3219456 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 3219456 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99000 session 0x5616e1f7b2c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109043712 unmapped: 3203072 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.034008980s of 10.189837456s, submitted: 37
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109182976 unmapped: 3063808 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1294918 data_alloc: 234881024 data_used: 15740928
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1295070 data_alloc: 234881024 data_used: 15749120
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321c800 session 0x5616e3750d20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2273400 session 0x5616e36892c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108748800 unmapped: 3497984 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92c9000/0x0/0x4ffc00000, data 0x22e7b24/0x23a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.504937172s of 10.520147324s, submitted: 3
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108732416 unmapped: 5611520 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1341628 data_alloc: 234881024 data_used: 15806464
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109731840 unmapped: 4612096 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29abb24/0x2a67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99800 session 0x5616e27985a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542c00 session 0x5616e2797860
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352434 data_alloc: 234881024 data_used: 15826944
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29aeb24/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29aeb24/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e10eec00 session 0x5616e311e1e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542000 session 0x5616e2620000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.650221825s of 10.002939224s, submitted: 115
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617400 session 0x5616e337e3c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286162 data_alloc: 234881024 data_used: 15585280
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9422000/0x0/0x4ffc00000, data 0x218eac2/0x2249000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f941f000/0x0/0x4ffc00000, data 0x2191ac2/0x224c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f941f000/0x0/0x4ffc00000, data 0x2191ac2/0x224c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286522 data_alloc: 234881024 data_used: 15585280
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 4669440 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 4669440 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321dc00 session 0x5616e3475a40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e3751860
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212351 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.967247963s of 12.068682671s, submitted: 26
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215375 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 5431296 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 5431296 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215111 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215111 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.472117424s of 15.489373207s, submitted: 5
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327e000 session 0x5616e2797c20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f922c000/0x0/0x4ffc00000, data 0x2385ac2/0x2440000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290065 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f922c000/0x0/0x4ffc00000, data 0x2385ac2/0x2440000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290065 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e2d70d20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220800 session 0x5616e312f4a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e337ef00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109068288 unmapped: 22077440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [1])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114139136 unmapped: 17006592 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118480896 unmapped: 12664832 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366317 data_alloc: 234881024 data_used: 25640960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366317 data_alloc: 234881024 data_used: 25640960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.837284088s of 18.897920609s, submitted: 9
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 12107776 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119062528 unmapped: 12083200 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f901d000/0x0/0x4ffc00000, data 0x2594ac2/0x264f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1397161 data_alloc: 234881024 data_used: 26198016
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1396570 data_alloc: 234881024 data_used: 26198016
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.991930008s of 12.083539009s, submitted: 22
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394635 data_alloc: 234881024 data_used: 26198016
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321dc00 session 0x5616e3475e00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e0ffeb40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.675941467s of 17.711282730s, submitted: 2
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3222c00 session 0x5616e3689a40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2273400 session 0x5616e34eaf00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e34ead20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e278ad20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114892800 unmapped: 16252928 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e3408000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284800 session 0x5616e36914a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.119201660s of 29.155227661s, submitted: 11
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99c00 session 0x5616e311b0e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 20881408 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e10e7c20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e10e63c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e0572780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284800 session 0x5616e1f7a1e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273675 data_alloc: 234881024 data_used: 14823424
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ad2/0x19be000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113844224 unmapped: 20979712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113844224 unmapped: 20979712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4000 session 0x5616e0d17860
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226b000 session 0x5616e37cef00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273675 data_alloc: 234881024 data_used: 14823424
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e34ea780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321277 data_alloc: 234881024 data_used: 16965632
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321277 data_alloc: 234881024 data_used: 16965632
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.900746346s of 15.997513771s, submitted: 18
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90fc000/0x0/0x4ffc00000, data 0x24b3af5/0x2570000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113655808 unmapped: 21168128 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90fc000/0x0/0x4ffc00000, data 0x24b3af5/0x2570000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1367081 data_alloc: 234881024 data_used: 17113088
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366154 data_alloc: 234881024 data_used: 17113088
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366154 data_alloc: 234881024 data_used: 17113088
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.825149536s of 17.985601425s, submitted: 37
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366022 data_alloc: 234881024 data_used: 17113088
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321bc00 session 0x5616e3696d20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f800 session 0x5616df7d8780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e311f860
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616dfb71c20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e0d161e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226b000 session 0x5616e1f054a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321bc00 session 0x5616e311a960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f800 session 0x5616e37ced20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e0d6ed20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114393088 unmapped: 20430848 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c38000/0x0/0x4ffc00000, data 0x2976b57/0x2a34000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413608 data_alloc: 234881024 data_used: 17113088
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114434048 unmapped: 20389888 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114434048 unmapped: 20389888 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.185551643s of 10.350721359s, submitted: 46
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e34db4a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 20652032 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 20652032 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417109 data_alloc: 234881024 data_used: 17113088
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114868224 unmapped: 19955712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449333 data_alloc: 234881024 data_used: 20922368
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449333 data_alloc: 234881024 data_used: 20922368
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 18014208 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.333662033s of 12.362901688s, submitted: 6
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 13238272 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 12673024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123871232 unmapped: 10952704 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f833b000/0x0/0x4ffc00000, data 0x3272b7a/0x3331000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123904000 unmapped: 10919936 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1524459 data_alloc: 234881024 data_used: 21643264
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123904000 unmapped: 10919936 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123912192 unmapped: 10911744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3280000 session 0x5616e2cd8b40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99000 session 0x5616e311e960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123445248 unmapped: 11378688 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2caf800 session 0x5616e2cd8780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370129 data_alloc: 234881024 data_used: 15147008
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370129 data_alloc: 234881024 data_used: 15147008
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119586816 unmapped: 15237120 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e34061e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.954436302s of 15.854330063s, submitted: 169
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e311a5a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226d000 session 0x5616e1f052c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.149909973s of 20.260728836s, submitted: 33
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321c400 session 0x5616e04dd680
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2caec00 session 0x5616e2797860
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e263c960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226d000 session 0x5616e359af00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e04dc1e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250106 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250238 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250238 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.556289673s of 17.624111176s, submitted: 16
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255618 data_alloc: 234881024 data_used: 10326016
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 17833984 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b34000/0x0/0x4ffc00000, data 0x1a7cac2/0x1b37000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d6000 session 0x5616e0f6e000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321a000 session 0x5616e337e5a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283400 session 0x5616e2d70000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282800 session 0x5616e34741e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.186046600s of 29.217700958s, submitted: 7
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258580 data_alloc: 234881024 data_used: 10338304
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283000 session 0x5616e313e780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f5000/0x0/0x4ffc00000, data 0x20bcac2/0x2177000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310128 data_alloc: 234881024 data_used: 10338304
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 23166976 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321cc00 session 0x5616e3476780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f5000/0x0/0x4ffc00000, data 0x20bcac2/0x2177000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3543400 session 0x5616e33d3e00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 23166976 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321cc00 session 0x5616e33d3a40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.488653183s of 10.665133476s, submitted: 14
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282800 session 0x5616e33d23c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 23158784 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1315916 data_alloc: 234881024 data_used: 10338304
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 23158784 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 23068672 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357250 data_alloc: 234881024 data_used: 16588800
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357250 data_alloc: 234881024 data_used: 16588800
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.383611679s of 12.435736656s, submitted: 16
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 16957440 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122388480 unmapped: 16113664 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1437520 data_alloc: 234881024 data_used: 17522688
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8bae000/0x0/0x4ffc00000, data 0x2a00af5/0x2abd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8bae000/0x0/0x4ffc00000, data 0x2a00af5/0x2abd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122970112 unmapped: 15532032 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435392 data_alloc: 234881024 data_used: 17522688
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b90000/0x0/0x4ffc00000, data 0x2a1faf5/0x2adc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.516505241s of 13.009800911s, submitted: 89
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 16072704 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435504 data_alloc: 234881024 data_used: 17522688
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122437632 unmapped: 16064512 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122437632 unmapped: 16064512 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435504 data_alloc: 234881024 data_used: 17522688
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.440578461s of 19.455564499s, submitted: 4
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283400 session 0x5616e312e960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283000 session 0x5616e37ceb40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e34065a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268225 data_alloc: 234881024 data_used: 10338304
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268225 data_alloc: 234881024 data_used: 10338304
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.921215057s of 17.030107498s, submitted: 36
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e800 session 0x5616e3406d20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542400 session 0x5616e34772c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 11K writes, 41K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 11K writes, 2812 syncs, 3.92 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2196 writes, 6935 keys, 2196 commit groups, 1.0 writes per commit group, ingest: 6.82 MB, 0.01 MB/s#012Interval WAL: 2196 writes, 936 syncs, 2.35 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.137660980s of 19.212322235s, submitted: 21
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617c00 session 0x5616e312fe00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9516000/0x0/0x4ffc00000, data 0x1c8bac2/0x1d46000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282483 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e34734a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e10efc00 session 0x5616e34ea3c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9516000/0x0/0x4ffc00000, data 0x1c8bac2/0x1d46000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617c00 session 0x5616e2d70780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e800 session 0x5616e1f05c20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116129792 unmapped: 22372352 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116170752 unmapped: 22331392 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 21725184 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2255c00 session 0x5616e2d71680
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e3476f00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 21725184 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312446 data_alloc: 234881024 data_used: 14000128
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.036809921s of 10.113059044s, submitted: 15
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3285000 session 0x5616e36881e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f1000/0x0/0x4ffc00000, data 0x1cafad2/0x1d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258361 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2619000 session 0x5616e33d2780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e3494000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283c00 session 0x5616e3691c20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e34ea1e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99800 session 0x5616e337f0e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e0fffc20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e05730e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2619000 session 0x5616e2cd83c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283c00 session 0x5616e05a2780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5400 session 0x5616dfb74b40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e311b680
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1346877 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8d84000/0x0/0x4ffc00000, data 0x241cad2/0x24d8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e3476000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282c00 session 0x5616e34761e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321ec00 session 0x5616e359ab40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.106111526s of 13.270271301s, submitted: 38
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5800 session 0x5616e33d2f00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 29736960 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348691 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8d83000/0x0/0x4ffc00000, data 0x241cae2/0x24d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e3404780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e04dc960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2cae400 session 0x5616e311a000
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.711370468s of 20.742525101s, submitted: 12
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265990 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265990 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327fc00 session 0x5616e312d680
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298342 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0541400 session 0x5616dfb705a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0541000 session 0x5616dfb703c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f961c000/0x0/0x4ffc00000, data 0x1b85ac2/0x1c40000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e11f9800 session 0x5616e311b4a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116719616 unmapped: 29130752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116719616 unmapped: 29130752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.859712601s of 14.898717880s, submitted: 17
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4800 session 0x5616e04dde00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301202 data_alloc: 234881024 data_used: 10452992
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116490240 unmapped: 29360128 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308194 data_alloc: 234881024 data_used: 11509760
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,1])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116695040 unmapped: 29155328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116858880 unmapped: 28991488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116858880 unmapped: 28991488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116867072 unmapped: 28983296 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116817920 unmapped: 29032448 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.749152184s of 11.151672363s, submitted: 349
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311098 data_alloc: 234881024 data_used: 11579392
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119087104 unmapped: 26763264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8920000/0x0/0x4ffc00000, data 0x286aac2/0x2925000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422902 data_alloc: 234881024 data_used: 11780096
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8920000/0x0/0x4ffc00000, data 0x286aac2/0x2925000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1414566 data_alloc: 234881024 data_used: 11788288
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.005815506s of 12.341868401s, submitted: 120
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8903000/0x0/0x4ffc00000, data 0x289eac2/0x2959000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 26370048 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220400 session 0x5616e04dd680
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284c00 session 0x5616e3477680
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 26370048 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108b400 session 0x5616dfb634a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 26451968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 26451968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618800 session 0x5616e312d0e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108b400 session 0x5616e312fc20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220400 session 0x5616e34725a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284c00 session 0x5616e312c780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 26427392 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.763683319s of 20.829357147s, submitted: 19
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4800 session 0x5616e3477c20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e11f9c00 session 0x5616e05732c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119709696 unmapped: 26140672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1324070 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d7000 session 0x5616e04dc960
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120020992 unmapped: 25829376 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119808000 unmapped: 26042368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1330608 data_alloc: 234881024 data_used: 11010048
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356296 data_alloc: 234881024 data_used: 14831616
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 24354816 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.424659729s of 16.518987656s, submitted: 14
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3281400 session 0x5616e0d6ed20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123994112 unmapped: 21856256 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453094 data_alloc: 234881024 data_used: 15106048
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124026880 unmapped: 21823488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87f4000/0x0/0x4ffc00000, data 0x29abac2/0x2a66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1459086 data_alloc: 234881024 data_used: 15007744
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x29b3ac2/0x2a6e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0540c00 session 0x5616e0cdb0e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327cc00 session 0x5616e278a780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 22183936 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2616400 session 0x5616e36910e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123813888 unmapped: 22036480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1466042 data_alloc: 234881024 data_used: 16191488
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87c7000/0x0/0x4ffc00000, data 0x29daac2/0x2a95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1466042 data_alloc: 234881024 data_used: 16191488
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87c7000/0x0/0x4ffc00000, data 0x29daac2/0x2a95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.643545151s of 19.996114731s, submitted: 90
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124805120 unmapped: 21045248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1548442 data_alloc: 234881024 data_used: 16363520
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124837888 unmapped: 21012480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7d82000/0x0/0x4ffc00000, data 0x3417ac2/0x34d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,1])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 21381120 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2cac000 session 0x5616e1f04b40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542800 session 0x5616e1ec81e0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555186 data_alloc: 234881024 data_used: 16371712
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123117568 unmapped: 22732800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7d01000/0x0/0x4ffc00000, data 0x34a0ac2/0x355b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555554 data_alloc: 234881024 data_used: 16371712
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.561103821s of 10.734168053s, submitted: 73
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555386 data_alloc: 234881024 data_used: 16371712
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555642 data_alloc: 234881024 data_used: 16371712
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.766269684s of 10.206396103s, submitted: 5
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0540c00 session 0x5616e312c3c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327cc00 session 0x5616e34734a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321b000 session 0x5616e0cdb680
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5000 session 0x5616e04dde00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327a400 session 0x5616e1f04780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1439208 data_alloc: 234881024 data_used: 15007744
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618c00 session 0x5616e311b860
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: mgrc ms_handle_reset ms_handle_reset con 0x5616dff2cc00
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1282799344
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1282799344,v1:192.168.122.100:6801/1282799344]
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: mgrc handle_mgr_configure stats_period=5
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d7c00 session 0x5616e312cd20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226a400 session 0x5616e1f04d20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618c00 session 0x5616dfb70b40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327a400 session 0x5616e1f7a780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.408138275s of 34.488780975s, submitted: 26
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5000 session 0x5616e34eba40
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325281 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325281 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 24403968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f400 session 0x5616e312c780
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121454592 unmapped: 24395776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337509 data_alloc: 234881024 data_used: 11628544
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337509 data_alloc: 234881024 data_used: 11628544
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.755592346s of 17.786962509s, submitted: 13
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,0,0,1])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 20889600 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1441767 data_alloc: 234881024 data_used: 13156352
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a1a000/0x0/0x4ffc00000, data 0x2786ac2/0x2841000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1434471 data_alloc: 234881024 data_used: 13156352
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a18000/0x0/0x4ffc00000, data 0x2789ac2/0x2844000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5800 session 0x5616e312c3c0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c400 session 0x5616e312cd20
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.164107323s of 11.421627998s, submitted: 102
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 23257088 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5800 session 0x5616e359a5a0
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'config show' '{prefix=config show}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 24305664 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'log dump' '{prefix=log dump}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'perf dump' '{prefix=perf dump}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'perf schema' '{prefix=perf schema}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 24182784 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 12K writes, 47K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s#012Cumulative WAL: 12K writes, 3443 syncs, 3.64 writes per sync, written: 0.03 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1534 writes, 5165 keys, 1534 commit groups, 1.0 writes per commit group, ingest: 4.83 MB, 0.01 MB/s#012Interval WAL: 1534 writes, 631 syncs, 2.43 writes per sync, written: 0.00 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 439.931579590s of 439.972412109s, submitted: 10
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,0,0,0,2])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121946112 unmapped: 23904256 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,1,1])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122183680 unmapped: 23666688 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'config show' '{prefix=config show}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}'
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 24207360 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb  2 05:24:41 np0005604791 ceph-osd[77691]: do_command 'log dump' '{prefix=log dump}'
Feb  2 05:24:41 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:41 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:41 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:41.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:41 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb  2 05:24:41 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2115974441' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb  2 05:24:41 np0005604791 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb  2 05:24:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb  2 05:24:42 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2759989341' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb  2 05:24:42 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:42 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:42 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:42.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:24:42 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb  2 05:24:42 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1968831604' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.044799) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883044858, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1393, "num_deletes": 258, "total_data_size": 3180907, "memory_usage": 3227664, "flush_reason": "Manual Compaction"}
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883064766, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2070865, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39411, "largest_seqno": 40799, "table_properties": {"data_size": 2064676, "index_size": 3328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14724, "raw_average_key_size": 20, "raw_value_size": 2051611, "raw_average_value_size": 2869, "num_data_blocks": 145, "num_entries": 715, "num_filter_entries": 715, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027779, "oldest_key_time": 1770027779, "file_creation_time": 1770027883, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 20017 microseconds, and 3350 cpu microseconds.
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.064819) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2070865 bytes OK
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.064842) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.067221) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.067241) EVENT_LOG_v1 {"time_micros": 1770027883067235, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.067260) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3173994, prev total WAL file size 3173994, number of live WAL files 2.
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.067940) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303031' seq:72057594037927935, type:22 .. '6C6F676D0031323535' seq:0, type:0; will stop at (end)
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2022KB)], [75(11MB)]
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883067972, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 14272057, "oldest_snapshot_seqno": -1}
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6908 keys, 14141304 bytes, temperature: kUnknown
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883186699, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 14141304, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14096790, "index_size": 26162, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17285, "raw_key_size": 182070, "raw_average_key_size": 26, "raw_value_size": 13973765, "raw_average_value_size": 2022, "num_data_blocks": 1027, "num_entries": 6908, "num_filter_entries": 6908, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027883, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.186898) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 14141304 bytes
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.226016) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 120.1 rd, 119.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.6 +0.0 blob) out(13.5 +0.0 blob), read-write-amplify(13.7) write-amplify(6.8) OK, records in: 7438, records dropped: 530 output_compression: NoCompression
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.226060) EVENT_LOG_v1 {"time_micros": 1770027883226044, "job": 46, "event": "compaction_finished", "compaction_time_micros": 118788, "compaction_time_cpu_micros": 20554, "output_level": 6, "num_output_files": 1, "total_output_size": 14141304, "num_input_records": 7438, "num_output_records": 6908, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883226401, "job": 46, "event": "table_file_deletion", "file_number": 77}
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883227566, "job": 46, "event": "table_file_deletion", "file_number": 75}
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.067871) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.227624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.227630) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.227632) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.227634) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.227636) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Feb  2 05:24:43 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4167825202' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Feb  2 05:24:43 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:43 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:43 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:43.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:43 np0005604791 nova_compute[226294]: 2026-02-02 10:24:43.716 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:24:44 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Feb  2 05:24:44 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3391420026' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Feb  2 05:24:44 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:44 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:44 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:44.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:44 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Feb  2 05:24:44 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3346937992' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb  2 05:24:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:24:44.926 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb  2 05:24:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:24:44.927 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb  2 05:24:44 np0005604791 ovn_metadata_agent[143537]: 2026-02-02 10:24:44.927 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb  2 05:24:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Feb  2 05:24:45 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/117477717' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Feb  2 05:24:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Feb  2 05:24:45 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/236570362' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Feb  2 05:24:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Feb  2 05:24:45 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3903962935' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Feb  2 05:24:45 np0005604791 systemd[1]: Starting Hostname Service...
Feb  2 05:24:45 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:45 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:45 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:45.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:45 np0005604791 systemd[1]: Started Hostname Service.
Feb  2 05:24:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Feb  2 05:24:45 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3038498722' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Feb  2 05:24:45 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Feb  2 05:24:45 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3318002558' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Feb  2 05:24:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Feb  2 05:24:46 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/951116633' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Feb  2 05:24:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Feb  2 05:24:46 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1216490102' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Feb  2 05:24:46 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:46 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:46 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:46.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Feb  2 05:24:46 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1937781476' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Feb  2 05:24:46 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Feb  2 05:24:46 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/812017344' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Feb  2 05:24:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb  2 05:24:47 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1828341276' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb  2 05:24:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Feb  2 05:24:47 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1322837347' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Feb  2 05:24:47 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:47 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:24:47 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:47.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:24:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Feb  2 05:24:47 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2100496108' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Feb  2 05:24:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Feb  2 05:24:47 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3084816001' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Feb  2 05:24:47 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb  2 05:24:48 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Feb  2 05:24:48 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/657262728' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Feb  2 05:24:48 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:48 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:24:48 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:48.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:24:48 np0005604791 nova_compute[226294]: 2026-02-02 10:24:48.718 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb  2 05:24:49 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Feb  2 05:24:49 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2380801877' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Feb  2 05:24:49 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:49 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb  2 05:24:49 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:49.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb  2 05:24:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Feb  2 05:24:50 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1466464595' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Feb  2 05:24:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Feb  2 05:24:50 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2157195115' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb  2 05:24:50 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb  2 05:24:50 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb  2 05:24:50 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:50 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb  2 05:24:50 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:50.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb  2 05:24:50 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb  2 05:24:50 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb  2 05:24:50 np0005604791 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Feb  2 05:24:50 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3165818654' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb  2 05:24:51 np0005604791 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb  2 05:24:51 np0005604791 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb  2 05:24:51 np0005604791 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:51.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb  2 05:24:51 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb  2 05:24:51 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb  2 05:24:51 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb  2 05:24:51 np0005604791 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
